Certifications and Site Trustworthiness

September 25, 2006


When a stranger promises "you can trust me," most people know to be extra vigilant. What conclusion should users draw when a web site touts a seal proclaiming its trustworthiness? Some sites that are widely regarded as extremely trustworthy present such seals. But those same seals feature prominently on sites that seek to scam users -- whether through spyware infections, spam, or other unsavory practices.

It's no great surprise that bad actors seek to free-ride on sites users rightly trust. Suppose users have seen a seal on dozens of sites that turn out to be legitimate. Dubious sites can present that same seal to encourage more users to buy, register, or download.

But certification issuers don't have to let this happen. They could develop and enforce tough rules, so that every site showing a seal is a site users aren't likely to regret visiting. Unfortunately, certification authorities don't always live up to this ideal. Writing tough rules isn't easy, and enforcing them is even harder. Hard-hitting rules are particularly unlikely when certification authorities get paid for each certification they issue -- but get nothing for rejecting an applicant.

Today I'm posting Adverse Selection in Online "Trust" Authorities, an empirical look at the best-known certification authority, TRUSTe. I cross-reference TRUSTe's ratings with the findings of SiteAdvisor -- where robots check web site downloads for spyware, and submit single-use addresses into email forms to check for spam, among other automated and manual tests. Of course SiteAdvisor data isn't perfect either, but if SiteAdvisor says a site is bad news, while TRUSTe gives it a seal, most users are likely to side with SiteAdvisor. (Full disclosure: I'm on SiteAdvisor's advisory board. But SiteAdvisor's methodology speaks for itself.)

Update (July 2009): I have posted a revised version of Adverse Selection in Online "Trust" Authorities, as published in the Proceedings of ICEC'09.

What do I find? In short, nothing good. I examine a sampling of 500,000+ top web sites, as reported by a major ISP. Of the sites certified by TRUSTe, 5.4% are untrustworthy according to SiteAdvisor's data, compared with just 2.5% untrustworthy sites in the rest of the ISP's list. So TRUSTe-certified sites are more than twice as likely to be untrustworthy. This result also holds in a regression framework controlling for site popularity (traffic rank) and even a basic notion of site type.
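The headline comparison is simple arithmetic. A minimal sketch follows, with invented counts chosen only so the group rates match the 5.4% and 2.5% figures reported above (the actual dataset is not reproduced here):

```python
# Hypothetical counts; only the resulting 5.4% / 2.5% rates mirror
# the figures reported in the paper.

def untrustworthy_rate(untrustworthy: int, total: int) -> float:
    """Fraction of sites in a group flagged untrustworthy by SiteAdvisor."""
    return untrustworthy / total

certified_rate = untrustworthy_rate(27, 500)          # TRUSTe-certified sites
other_rate = untrustworthy_rate(12_500, 500_000)      # rest of the ISP's list

risk_ratio = certified_rate / other_rate
print(f"certified: {certified_rate:.1%}, others: {other_rate:.1%}, "
      f"ratio: {risk_ratio:.2f}x")
# Prints a ratio above 2: certified sites are more than twice as
# likely to be flagged.
```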

Particularly persuasive are some specific sites TRUSTe has certified as trustworthy, although in my experience typical users would disagree. I specifically call out four sites certified by TRUSTe as of January 2006:

This is an academic article -- ultimately likely to be a portion of my Ph.D. dissertation. So it's mathematical in places where that's likely to be helpful (to some readers, at least), and it's not as accessible as most of my work. But for those who are concerned about online safety, it may be worth a read. Feedback welcomed.


In its response to my article, TRUSTe points out that Direct Revenue and Maxmoolah no longer hold TRUSTe certifications. True. But Maxmoolah was certified for 13+ months (from February 2005 through at least March 2006), and Direct Revenue was certified for at least 8 months (from April 2005 or earlier, through at least January 2006). These companies' practices were bad all along. TRUSTe need not have certified them in the first place.

TRUSTe then claims that its own web site made an "error" in listing FunWebProducts as a member. TRUSTe does not elaborate as to how it made so fundamental a mistake -- reporting that a site has been certified when it has not. TRUSTe's FunWebProducts error was compounded by the apparent additional inclusion of numerous other near-identical Ask.com properties (Cursormania, Funbuddyicons, Historyswatter, Mymailstationery, Smileycentral, Popularscreensavers). TRUSTe's error is particularly troubling because at least some of the erroneously-listed sites were listed as certified for 17 months or longer (from May 2005 or earlier, through at least September 12, when Google last crawled TRUSTe's member list).

As to Webhancer, TRUSTe claims further tests (part of TRUSTe's Trusted Download program) will confirm the company's practices. But that's of little benefit to consumers who currently see Webhancer's seal and mistakenly conclude TRUSTe has already conducted an appropriate review of Webhancer's products, when in fact it has not. Meanwhile, I have personally observed Webhancer's bad installation practices day in and day out -- including widespread nonconsensual installations by the notorious Dollar Revenue, among others. These observations are trivial to reproduce, yet Webhancer remains a TRUSTe certificate holder to this day.

Consumers deserve certifications that are correctly issued in the first place -- not merely revoked after months or years of notorious misbehavior, and not mistakenly listed as having been issued when in fact they were not. TRUSTe is wrong to focus on the few specific examples I chose to highlight. The problem with TRUSTe's approach is more systemic, as indicated by the many other dubious TRUSTe-certified sites analyzed in my dataset but not called out by name in my paper or appendix.

Consider some of the other unsavory sites TRUSTe has certified:

TRUSTe's response claims that my conclusions somehow reflect SiteAdvisor idiosyncrasies. I disagree. I can't imagine any reasonable, informed consumer wanting to do business with sites like these. TRUSTe can do better, and in the future, I hope it will.


I'm sometimes asked where I'm headed, personally and professionally. Posting a new academic article offers an appropriate occasion to explain. I'm still working on my economics Ph.D., having drafted several papers about pay-per-click advertising (bidding strategies, efficiency, revenue comparisons), with more in the pipeline. After that? An academic job might be a good fit, though that's not the only option. Here too, I'd welcome suggestions.