IAC Toolbars and Traffic Arbitrage in 2013

Beginning in 2005, I flagged serious problems with IAC/Ask.com toolbars — including installations through security exploits and through bundles that nowhere sought user consent, installations targeting kids, rearranging users’ browsers to invite unintended searches, and showing a veritable onslaught of ads. IAC’s practices have changed in various respects, but the core remains as I previously described it: IAC’s search advertising business exists not to solve a genuine user need or provide users with genuine assistance, but to prey on users who — through inattention, inexperience, youth, or naivete — stumble into IAC’s properties.

Crucially, IAC remains substantially dependent on Google for monetization of IAC’s search services. A rigorous application of Google’s existing rules would put a stop to many of IAC’s practices, and sensible updated rules — following the stated objective of Google’s existing policies — would end much of the rest.

In this piece I examine current IAC toolbar installation practices (including targeting kids and soliciting installations when users are attempting to install security updates), the effects of IAC toolbars once installed (including excessive advertising and incomplete uninstall), and IAC’s search arbitrage business. I conclude by flagging advertisements with impermissibly large clickable areas (for both toolbars and search arbitrage), and I call on Google to put an end to Ask’s practices.

IAC Toolbar Installation

IAC’s search toolbar business is grounded in placing IAC toolbars on as many computers as possible. To that end, IAC offers 50+ different toolbars with a variety of branding — Webfetti (“free Facebook graphics”), Guffins (“virtual pet games”), religious toolbars of multiple forms (Know the Bible, Daily Bible Guide, Daily Jewish Guide), screensavers, games, and more. One might reasonably ask: Why would a user want such a toolbar?

IAC ad promises "free online television" but actually merely links to material already on the web; promises an "app" but actually provides a search toolbar.

IAC ad solicits installations via "virtual pet" ad distinctively catering to kids.

Other IAC Guffins ads specifically invite "kids" to install. (Screenshot by iSpionage)

The IAC Guffins offer features multiple animated cartoon images, distinctively catering to kids.

The Television Fanatic toolbar is instructive. IAC promotes this toolbar with search ads that promise "free online television" and "turn your computer into a TV watch full TV episode w free app." It sounds like an attractive deal — many users would relish the ability to watch free live broadcast television on an ordinary computer, and it would not be surprising if such a service required downloading some sort of desktop application or browser plug-in. But in fact Television Fanatic offers nothing of the sort. To the extent that Television Fanatic offers the "free online television" promised in the ad, it only links to ordinary video content already provided by others. (For example, I clicked the toolbar's "ABC" link and was taken to http://abc.go.com/watch/ — an ordinary ABC link equally available to users without Television Fanatic.) That's a far cry from IAC's promise of special access to premium material.

Meanwhile, IAC’s Guffins toolbar distinctively targets kids. IAC promotes Guffins via search ads for terms like “virtual pet”, and the resulting ad says Guffins offers “puppy, cats, bunny, dragons & more” which a user can “feed, play, [and] care for.” The landing page features four animated animals with oversized faces and overstated features, distinctively attractive to children. Under COPPA factors or any intuitive analysis, IAC clearly targets kids. Indeed, ad tracking service iSpionage reports Guffins ads touting “Free Kids Games Download”, “Free Kids Computer Games”, “Play Kids Games Online”, and more — explicitly inviting children to install Guffins. Of course kids are ill-equipped to evaluate IAC’s offer — less likely to notice IAC’s disclosures of an included toolbar, less likely to understand what a search toolbar even is, and less able to evaluate the wisdom of installing such a toolbar in exchange for games.

While IAC's ads often promise an "app" (including as shown in the ad screenshots at right), IAC actually offers just toolbars — add-ins appearing within web browsers, not the freestanding applications that the ads suggest. That's all the more deceptive: IAC enticed users with the promise of genuine distinct programs offering exceptional video content and rich gaming. Instead IAC provided browser plug-ins that claim valuable screen space whenever users browse the web. And far from providing exclusive content, IAC toolbars send users to material already on the web and drive traffic to IAC's advertising displays (as detailed in the next section). That's strikingly inferior.

IAC’s toolbar installation practices stack up unfavorably vis-a-vis applicable Google policies, industry standards, and regulatory requirements. Google’s Software Principles call for “Upfront disclosure” with no suggestion that an app may promise one thing in an initial solicitation, then something else in a subsequent landing page. (IAC is obliged to comply with Google’s rules because IAC toolbars show ads from Google, as discussed in the next section.) Meanwhile, the Anti-Spyware Coalition specifically flags installations targeting children, allowing bundling by affiliates, and modifying browser settings as risk factors making software a greater concern. Even decades-old FTC rules are on point, disallowing “deceptive door openers” that promise one thing at the outset (like IAC’s initial promise of “free online television”) but later deliver something importantly different (a search toolbar).

Web searches reveal numerous user complaints about IAC toolbars. Consider search results for "televisionfanatic". The first result links to the product's official site. The second is a Sitejabber forum with 20 harsh reviews. (17 reviewers gave Television Fanatic just one star out of five, with comments systematically reporting surprise and annoyance at the toolbar's presence.) The third result advises "How to uninstall a Television fanatic toolbar", and the fourth links to multiple Yahoo Answers discussions, including a user asking "Is television fanatic toolbar a virus?" and others repeatedly complaining about unintended installation. Clearly numerous users are dissatisfied with Television Fanatic.

So too for DailyBibleGuide. In a Q1 2011 earnings call, IAC CEO Greg Blatt touted the DailyBibleGuide toolbar as a new product IAC is particularly proud of. But Google search results for "DailyBibleGuide" include a page advising "do not download Dailybiblestudy, Dailybibleguide, or Knowthebible extension." There and elsewhere, users seem surprised to receive IAC's toolbars. Reading users' complaints, it seems their confusion ultimately results from IAC's decision to deliver bible trivia via a toolbar. After all, such material would more naturally be delivered via a web page, email newsletter, or perhaps RSS feed. IAC chose the odd strategy of toolbar-based delivery not because it was genuinely what users wanted, but because this is the format IAC can best monetize. No wonder users systematically end up disappointed.

By all indications, a huge number of users are running IAC toolbars. The IAC toolbars discussed in this section all send users to mywebsearch.com, a site users are unlikely to visit except if sent there by an IAC toolbar. Alexa reports that mywebsearch.com is the #41 most popular site in the US and #71 worldwide — more popular than Instagram, Flickr, Pandora, and Hulu.

Some of IAC's browser configuration changes remain in place even if a user removes an IAC toolbar. I installed and then uninstalled an IAC Television Fanatic toolbar and received a prompt instructing "Click here for help on resetting your home page and default search settings." The resulting page specified four different procedures totaling 16 steps — far lengthier than the initial installation. I can see no proper reason why uninstall is so difficult. Indeed, IAC's incomplete uninstall specifically violates Google's October 2012 requirement that "During the uninstall process, users must be presented with a choice that gives them the option of returning their browser's user settings to the previous settings." Google's Software Principles are also on point, instructing that uninstall must be "easy" and must disable "all functions of the application" — whereas IAC's automated uninstaller does not undo all of IAC's changes, and IAC's manual 16-step process is the opposite of "easy."

The Special Problems of IAC Ask Toolbar Installed by Oracle’s Java Updates

Oracle Java security updates install Ask Toolbar by default, with just a single click in a multi-step installer.

Ongoing Oracle Java updates also install the IAC Ask Toolbar. I discuss these installations in this separate section because they raise concerns somewhat different from the IAC toolbars discussed above. I see five key problems with Oracle Java updates that install IAC toolbars:

First, as Ed Bott noted last week, the "Install the Ask Toolbar" checkbox is prechecked, so users can install the Ask toolbar with a single click on the "Next" button. Accidental installations are particularly likely because the Ask installation prompt is step three of a five-screen installation process. When installing myriad software updates, it's easy to get into a routine of repeatedly clicking Next to finish the process as quickly as possible. But in this case, just clicking Next yields the installation of Ask's toolbar.

Second, although the Ask installation prompt does not show a “focus” (a highlighted button designated as the default if a user presses enter), the Next button actually has focus. In testing, I found that pressing the enter or spacebar keys has the same effect as clicking “Next.” Thus, a single press of either of the two largest keys on the keyboard, with nothing more, is interpreted as consent to install Ask. That’s much too low a bar — far from the affirmative indication of consent that Google rules and FTC caselaw call for.

Third, in a piece posted today, Ed Bott finds Oracle and IAC intentionally delaying the installation of the Ask Toolbar by fully ten minutes. This delay undermines accountability, especially for sophisticated users. Consider a user who mistakenly clicks Next (or presses enter or spacebar) to install Ask Toolbar, but immediately realizes the mistake and seeks to clean his computer. The natural strategy is to visit Control Panel – Programs and Features to activate the Ask uninstaller. But a user who immediately checks that location will find no listing for the Ask Toolbar: The uninstaller does not appear until the Ask install finishes after the intentional ten minute delay. Of course even sophisticated users have no reason or ability to know about this delay. Instead, a sophisticated user would conclude that he somehow did not install Ask Toolbar after all — and only later will the user notice and, perhaps, proceed with uninstall. Half a decade ago I found WhenU adware engaged in similar intentional delay. Similarly, NYAG litigation documents revealed notorious spyware vendor Direct Revenue intentionally declining to show ads in the first day after its installation. (Direct Revenue staff said this delay would “reduce the correlation between the Morpheus download [which bundled Direct Revenue spyware] and why they are seeing [Direct Revenue’s popup] ads” — confusion that DR staff hoped would “creat[e] less of a path to what they [users] should uninstall.”) Against this backdrop, it’s particularly surprising to see IAC and Oracle adopt this tactic.

Fourth, IAC makes changes beyond the scope of user consent and fails to revert these changes during uninstall. The Oracle/IAC installation solicitation seeks permission to install an add-on for IE, Chrome, and Firefox, but nowhere mentions changing address bar search or the default Chrome search provider. Yet the installer in fact makes all these changes, without ever seeking or receiving user consent. Conversely, uninstall inexplicably fails to restore these settings. As noted above, these incomplete uninstalls violate Google’s Software Principles requirement that an “easy” uninstall must disable “all functions of the application.”

Finally, the Java update is only needed as a result of a serious security flaw in Java. It is troubling to see Oracle profit from this security flaw by using a security update as an opportunity to push users to install extra advertising software. Java's many security problems make bundled installs all the worse: I've received a new Ask installation prompt with each of Java's many security updates. (Ed Bott counts 11 over the last 18 months.) Even if a user declined IAC's offer on half a dozen prior requests, Oracle persists in asking — and a single slip-up, just one click or keystroke on the tenth request, will nonetheless deliver Ask's toolbar.

A security update should never serve as an opportunity to push additional software. As Oracle knows all too well from its recent security problems, users urgently need software updates to fix serious vulnerabilities. By bundling advertising software with security updates, Oracle teaches users to distrust security updates, deterring users from installing updates from both Oracle and others. Meanwhile, by making the update process slower and more intrusive, Oracle reduces the likelihood that users will successfully patch their computers. Instead, Oracle should make the update process as quick and easy as possible — eliminating unnecessary steps and showing users that security updates are quick and trouble-free.

Toolbar Operations and Result Format

Once a user receives an IAC toolbar, a top-of-browser stripe appears in Internet Explorer and Firefox, and IAC also takes over default search, address bar search, and error handling. That’s an intrusive set of changes, and particularly undesirable in light of the poor quality of IAC’s search results.

If a user runs a search through an IAC toolbar or through a browser search function modified by IAC, the user receives a Mywebsearch or Ask.com results page with advertisements and search results syndicated from Google. The volume of advertisements is remarkable: On an 800×600 monitor, the entire first two screens of Mywebsearch results presented advertisements (screen one, screen two) — four large ads with a total of seven additional miniature ads contained within. The first algorithmic search result appears on the third on-screen page, where users are far less likely to see it. At Ask.com, ads are even larger: fully seven advertisements appear above the first algorithmic result, and three more ads appear at page bottom — more than filling two 800×600 screens.

IAC obtains these advertisements and search results from Google, but IAC omits features Google proudly touts in other contexts. For example, Google claims that its maps, hotel reviews, and hotel price quotes benefit users and save users time — but inexplicably IAC Mywebsearch lacks these features, even though these features appear prominently and automatically for users who run the same search at Google. In short, a user viewing IAC results gets listings that are intentionally less useful — designed to serve IAC’s business interest in encouraging the user to click extra advertisements, with much less focus on providing the information that IAC and Google consider most useful.

The ad format at IAC Mywebsearch and Ask.com makes it particularly likely that users will mistake IAC ads for algorithmic results. For one, IAC omits any distinctive background color to help users distinguish ads from algorithmic results. Furthermore, IAC's voluminous ads extend beyond the first screen of results for many searches. A user familiar with Google would expect ads to have a distinctive background color and would know that ads rarely fill an entire screen — so seeing no such background color, and similar-format results continuing for two full screens, the user might well conclude that these are algorithmic listings rather than paid advertisements.

Traffic Arbitrage

IAC buys traffic from Google and other search engines. The resulting sequence is needlessly convoluted: A user runs a search at Google, clicks an IAC ad purporting to offer what the user requested, then receives an IAC landing page with the very same ads just seen at Google. For example, I searched for [800 number look up] at Google and clicked an Ask ad. The resulting Ask page allocated most of its above-the-fold space to three of the same ads I had just seen at Google! This process provides zero value to the user — indeed, negative value, in that the extra click adds time and confusion. But IAC monetizes its site unusually aggressively — for example, regularly putting four ads at the top of the page, where Google sometimes puts none and never presents more than three. Of course these extra ads serve IAC's interest: By pushing a fraction of users to click multiple ads, IAC can more than cover its costs of buying the traffic from Google in the first place.

Longstanding Google rules squarely prohibit IAC's search arbitrage. Google's AdWords Policy Center instructs that "Google AdWords doesn't allow the promotion of websites that are designed for the sole or primary purpose of showing ads." Google continues: "One example of this kind of prohibited behavior is called arbitrage, where advertisers drive traffic to their websites at low cost and pay for that traffic by earning money from the ads placed on those websites."

Why isn’t Google enforcing its rules against arbitrage? An October 2012 Search Engine Land article quotes a reader who wrote to Google AdWords support, where a representative replied with unusual candor: “Since Ask.com is considered a Google product, they are able to serve ads at the top of the page when the search query is found to be relevant to their ads.” Of course Ask.com is not actually “a Google product” — it’s a Google syndicator, showing Google ads in exchange for a revenue share, just like thousands of other sites. But with IAC reportedly Google’s biggest advertising customer, special privileges would be less than surprising. Meanwhile, Google lets IAC do Google’s dirty work — showing extra ads to gullible users — which could let Google collect additional ad revenue from those users’ clicks. Still, that’s no help to users (who get pulled into extra page-views and less useful pages with more advertisements) or advertisers (whose costs increase as a result). And once the public recognizes Google’s role in authorizing this scheme, selling all advertising, and funding the entirety of IAC’s activity, Google ends up looking at least as culpable as IAC.

Ads with Oversized Clickable Areas

Contrary to standard industry practice and Google rules, IAC makes the entire ad — including domain name, ad text, and large whitespace — into a clickable link. Notice the large clickable area flagged in the red box.

At Google, only the ad title is clickable. Note the much smaller red box.

IAC’s ads also flout industry practice and Google rules as to the size of an ad’s clickable area. Both in arbitrage landing pages and in toolbar results, IAC’s search result pages expand the clickable area of each advertisement to fill the entire page width, sharply increasing the fraction of the page where a click will be interpreted as a request to visit the advertiser’s page.

See the screenshot at right. (To create the red-outlined box showing the shape of the clickable area, I clicked an empty section of the ad and began a brief drag, causing my browser to highlight the ad’s clickable area in red as shown in the screenshot.)

Ask is an outlier in converting whitespace around an ad into a clickable area. Every other link on Ask.com landing pages — every link other than an advertisement — follows standard industry practice, with only the words of the link clickable, not the surrounding whitespace. Indeed, at Google, Bing, and Yahoo, white space is never clickable. At Google and Bing, only ad titles are clickable, not ad domain names or ad text. (See the Google screenshot at right, showing the limited clickable area of a Google ad.) At Yahoo, only ad titles and domain names are clickable, not ad text or white space.

IAC has taken intentional action to expand its ads' clickable area to cover all available width. As W3Schools explains, "A block element is an element that takes up the full width available." To expand ad hyperlinks to fill the entire width, Ask tags each ad hyperlink with the CSS style display:block:

<a id="lindp" class="ptbs pl20 pr30 ptsp pxl" style="display:block;padding-bottom: 0px;" …
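
As an illustration of how one might spot this styling in a saved copy of an Ask.com results page, here is a minimal sketch in Python. The page_html sample and the regular expression are my own illustrative assumptions; only the style attribute is taken from the tag above.

import re

# Minimal sketch: flag <a> tags whose inline style includes display:block -- the
# technique described above that stretches an ad link's clickable area to the
# full available width. page_html is an illustrative sample reusing the style
# attribute from the Ask.com tag shown above.
page_html = '<a id="lindp" class="ptbs pl20 pr30 ptsp pxl" style="display:block;padding-bottom: 0px;" href="#">Sponsored result</a>'

for tag in re.findall(r'<a\b[^>]*style="[^"]*display:block[^"]*"[^>]*>', page_html):
    print("Full-width clickable link:", tag)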

Google’s rules prohibit IAC’s expanded clickable areas. Google requires that “clicking on space surrounding an ad should not click the ad.” Yet IAC nonetheless makes a clickable area out of the area surrounding each ad, extending all the way to the right column.

IAC’s expanded ads invite accidental clicks. Accidental clicks are particularly likely from the inexperienced users IAC systematically targets for toolbar installations, and also from users searching on tablets, phones, and other touch devices. These extra clicks waste users’ time and drive up advertisers’ costs — but every such click yields extra revenue for IAC and Google.

What Comes Next

Google should enforce its rules strictly. No doubt IAC can offer Google some short-term revenue via extra ad-clicks from unsophisticated or confused users. But this isn’t the kind of business Google aspires to, and Google’s public statements indicate no interest in such bottom-feeding. Indeed, a fair application of Google’s existing AdWords rules would disallow both IAC’s toolbar ads (using AdWords to solicit installations) and IAC’s search arbitrage ads (using AdWords to send users to IAC pages presenting syndicated AdWords ads). Meanwhile, numerous Google AdSense rules are also on point, including prohibiting encouraging accidental clicks, prohibiting site layout that pushes content below the fold, and limiting the number of ads per page. So too for Software Principles requiring up-front disclosure as well as “easy” and complete uninstall.

As a publicly-traded company, IAC should benefit from the oversight and guidance of its outside directors. But the New York Times commented in 2011 that “IAC’s board is filled with high-powered friends of Mr. Diller,” calling into question the independence and effectiveness of IAC’s outside directors. Of particular note is Chelsea Clinton, who joined IAC’s board in September 2011. Ms. Clinton’s prior experience includes little obvious connection to Internet advertising or online business, suggesting that she might need to invest extra time to learn the details of IAC’s business. Yet she also has weighty commitments including ongoing doctoral studies, serving as an Assistant Vice Provost at NYU, and reporting as a special correspondent for the NBC Nightly News — calling into question the time she can devote to IAC matters. The Times questioned why IAC had brought in Ms. Clinton, concluding that “This is clearly an appointment made because of who she is, not what she has done.” Indeed, Ms. Clinton’s background means she will be held to a particularly high standard: if she fails to stop IAC’s bad practices, the public may reasonably ask whether she has done her duty as an outside director.

Recent research from Goldman analyst Heath Terry flags investor concerns about IAC's tactics. In a December 4, 2012 report, Terry downgraded IAC to sell, citing its vulnerability to Google policy changes. A January 9, 2013 follow-up noted IAC changing its uninstall practices to comply with Google policy, as well as a slowdown in arbitrage. Terry flags some important factors, and I share his bottom line that IAC's search practices are unsustainable. But the real shoe has yet to drop. If Google is embarrassed at IAC's actions — and it should be — Google is easily able to put an end to this mess.

I prepared a portion of this article at the request of a client that prefers not to be listed by name. The client kindly agreed to let me include that research in this publicly-available posting.

A Holiday “Top 10”: Rogue Affiliates at Commission Junction and LinkShare with Wesley Brandi

Our automation continuously scours the web for rogue affiliates. In our query tool, we provide a basic sense of how much we’ve found. We have also written up scores of sample rogue affiliates, but the holiday season provides an impetus for more: Thanks to high online spending, affiliate fraud at this time of year is particularly profitable for perpetrators — and particularly costly to merchants.

In today’s article, we report the ten Commission Junction affiliates and ten LinkShare affiliates most often seen by our automation. Our findings:

Twenty Oft-Found Commission Junction and LinkShare Affiliate Violations

Google Toolbar Tracks Browsing Even After Users Choose "Disable"

Disclosure: I serve as co-counsel in unrelated litigation against Google, Vulcan Golf et al. v. Google et al. I also serve as a consultant to various companies that compete with Google. But I write on my own — not at the suggestion or request of any client, without approval or payment from any client.

Run the Google Toolbar, and it's strikingly easy to activate "Enhanced Features" — transmitting to Google the full URL of every page-view, including searches at competing search engines. Some critics find this a significant privacy intrusion (1, 2, 3). But in my testing, even Google's bundled toolbar installations provide some modicum of notice before installing. And users who want to disable such transmissions can always turn them off – or so I thought until I recently retested.

In this article, I provide evidence calling into question the ability of users to disable Google Toolbar transmissions. I begin by reviewing the contents of Google’s "Enhanced Features" transmissions. I then offer screenshot and video proof showing that even when users specifically instruct that the Google Toolbar be “disable[d]”, and even when the Google Toolbar seems to be disabled (e.g., because it disappears from view), Google Toolbar continues tracking users’ browsing. I then revisit how Google Toolbar’s Enhanced Features get turned on in the first place – noting the striking ease of activating Enhanced Features, and the remarkable absence of a button or option to disable Enhanced Features once they are turned on. I criticize the fact that Google’s disclosures have worsened over time, and I conclude by identifying changes necessary to fulfill users’ expectations and protect users’ privacy.

"Enhanced Features" Transmissions Track Page-Views and Search Terms

Certain Google Toolbar features require transmitting to Google servers the full URLs users browse. For example, to tell users the PageRank of the pages they browse, the Google Toolbar must send Google servers the URL of each such page. Google Toolbar’s “Related Sites” and “Sidewiki” (user comments) features also require similar transmissions.

With a network monitor, I confirmed that these transmissions include the full URLs users visit – including domain names, directories, filenames, URL parameters, and search terms. For example, I observed the transmission below when I searched Yahoo (green highlighting) for "laptops" (yellow).

GET /search?client=navclient-auto&swwk=358&iqrn=zuk&orig=0gs08&ie=UTF-8&oe=UTF-8&querytime=kV&querytimec=kV &features=Rank:SW:&q=info:http%3a%2f%2frds.yahoo.com%2f_ylt%3dA0oGkl32p1tLT2EB8ohXNyoA%2fSIG%3d18045klhr%2f
EXP%3d1264384374%2f**http%253a%2f%2fsearch.yahoo.com%2fsearch%253fp%3dlaptops%2526fr%3dsfp%2526xargs%3d12KP
jg1itSroGmmvmnEOOIMLrcmUsOkZ7Fo5h7DOV5CtdY6hNdE%25252DIfXpP0xZg6WO8T7xvSy7HBreVFdJGu277WVk0qfeG%25255FGOW%2
5255F772GnNVme5ujWkF3s%25252DJ%25255F0%25252Dmdn4RvDE8%25252E%2526pstart%3d7%2526b%3d11&googleip=O;72.14.20
4.104;226&ch=711984234986 HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; GoogleToolbar 6.4.1208.1530; Windows XP 5.1; MSIE 8.0.6001.18702)
Accept-Language: en
Host: toolbarqueries.google.com
Connection: Keep-Alive
Cache-Control: no-cache
Cookie: PREF=ID=…

HTTP/1.1 200 OK …
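
To make concrete how much that request reveals, here is a minimal decoding sketch in Python. The request_path string is a truncated, illustrative copy of the GET line above; unwrapping the percent-encoding exposes the page being browsed (a Yahoo redirect) and, one layer further down, the search keywords.

from urllib.parse import urlparse, parse_qs, unquote

# Truncated, illustrative copy of the toolbarqueries.google.com request shown above.
request_path = ("/search?client=navclient-auto&features=Rank:SW:"
                "&q=info:http%3a%2f%2frds.yahoo.com%2f_ylt%3dXYZ"
                "%2f**http%253a%2f%2fsearch.yahoo.com%2fsearch%253fp%3dlaptops%2526fr%3dsfp")

params = parse_qs(urlparse(request_path).query)
visited = params["q"][0][len("info:"):]      # the URL being browsed, decoded once
print(visited)                               # http://rds.yahoo.com/_ylt=XYZ/**http%3a//search.yahoo.com/...
inner = unquote(visited.split("**", 1)[1])   # the Yahoo redirect wraps the search URL in a second encoding layer
print(inner)                                 # http://search.yahoo.com/search?p=laptops&fr=sfp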

 

Screenshots – Google Toolbar Continues Tracking Browsing Even When Users "Disable" the Toolbar via Its "X" Button

Consistent with modern browser plug-in standards, the current Google Toolbar features an “X” button to disable the toolbar:

Google Toolbar features an "X" button to disable the toolbar.

I clicked the “X” and received a confirmation window:

I chose the top option and pressed OK. The Google Toolbar disappeared from view, indicating that it was disabled for this window, just as I had requested. Within the same window, I requested the Whitehouse.gov site:

Google Toolbar disappeared from view, as instructed.  Google Toolbar seems to be disabled.

Although I had asked that the Google Toolbar be "disable[d] … for this window", and although the Google Toolbar disappeared from view, my network monitor revealed that Google Toolbar continued to transmit my browsing to its toolbarqueries.google.com server:

See also a screen-capture video memorializing these transmissions.

These Nonconsensual Transmissions Affect Important, Routine Scenarios (added 1/26/10, 12:15pm)

In a statement to Search Engine Land, Google argued that the problems I reported are "only an issue until a user restarts the browser." I emphatically disagree.

Consider the nonconsensual transmission detailed in the preceding section: A user presses "x", is asked "When do you want to disable Google Toolbar and its features?", and chooses the default option, to "Disable Google Toolbar only for this window." The entire purpose of this option is to take effect immediately. Indeed, it would be nonsense for this option to take effect only upon a browser restart: Once the user restarts the browser, the "for this window" disabling is supposed to end, and transmissions are supposed to resume. So Google Toolbar transmits web browsing before the restart, and after the restart too. I stand by my conclusion: The "Disable Google Toolbar only for this window" option doesn’t work at all: It does not actually disable Google Toolbar for the specified window, not immediately and not upon a restart.

Crucially, these nonconsensual transmissions target users who are specifically concerned about privacy. A user who requests that Google Toolbar be disabled for the current window is exactly seeking to do something sensitive, confidential, or embarrassing, or in any event something he does not wish to record in Google’s logs. This privacy-conscious user deserves extra privacy protection. Yet Google nonetheless records his browsing. Google fails this user — specifically and unambiguously promising to immediately stop tracking when the user so instructs, but in fact continuing tracking unabated.

Google Toolbar Continues Tracking Browsing Even When Users "Disable" the Toolbar via "Manage Add-Ons"

Internet Explorer 8 includes a Manage Add-Ons screen to disable unwanted add-ons. On a PC with Google Toolbar, I activated Manage Add-Ons, clicked the Google Toolbar entry, and chose Disable. I accepted all defaults and closed the dialog box.

Again I requested the whitehouse.gov site. Again my network monitor revealed that Google Toolbar continued to transmit my browsing to its toolbarqueries.google.com server. Indeed, as I requested various pages on the whitehouse.gov site, Google Toolbar transmitted the full URLs of those pages, as shown in the second and third circles below.

See also a screen-capture video memorializing these transmissions.

In a further test, performed January 23, I reconfirmed the observations detailed here. In that test, I demonstrated that even checking the "Google Toolbar Notifier BHO" box and disabling that component does not impede these Google Toolbar transmissions. I also specifically confirmed that these continuing Google Toolbar transmissions track users' searches at competing search engines. See screen-capture video.

In my tests, in this Manage Add-Ons scenario, Google Toolbar transmissions cease upon the next browser restart. But no on-screen message alerts the user that a restart is needed for the change to take effect, so the user has no reason to think a restart is required.

Google Toolbar Continues Tracking Browsing When Users "Disable" the Toolbar via Right Click (added 1/26/10 11:00pm)

Danny Sullivan asked me whether Google Toolbar continues Enhanced Features transmissions if users hide Google Toolbar via IE’s right-click menu. In a further test, I confirmed that Google Toolbar transmissions continue in these circumstances. Below are the four key screenshots: 1) I right-click in empty toolbar space, and I uncheck Google Toolbar. 2) I check the final checkbox and choose Disable. 3) Google Toolbar disappears from view and appears to be disabled. I browse the web. 4) I confirm that Google Toolbar’s transmissions nonetheless continue. See also a screen-capture video.

I right-click in empty toolbar space, and I uncheck Google Toolbar I check the final checkbox and choose Disable

Google Toolbar Disappeared from View and Seems to Be Disabled

I confirm that Google Toolbar's transmissions nonetheless continue.

Users May Easily or Accidentally Activate “Enhanced Features” Transmissions

Google Toolbar invites users to activate Enhanced Features with a single click, the default. Also, notice the self-contradictory statements (transmitting 'site' names versus full 'URL' addresses).

The above-described transmissions occur only if a user runs Google Toolbar in its “Enhanced Features” mode. But it is strikingly easy for a user to stumble into this mode.

For one, the standard Google Toolbar installation encourages users to activate Enhanced Features via a “bubble” message shown at the conclusion of installation. See the screenshot at right. This bubble presents a forceful request for users to activate Sidewiki: The feature is described as “enhanced” and “helpful”, and Google chooses to tout it with a prominence that indicates Google views the feature as important. Moreover, the accept button features bold type plus a jumbo size (more than twice as large as the button to decline). And the accept button has the focus – so merely pressing Space or Enter (easy to do accidentally) serves to activate Enhanced Features without any further confirmation.

I credit that the bubble mentions the important privacy consequence of enabling Enhanced Features: “For enhanced features to work, Toolbar has to tell us what site you’re visiting by sending Google the URL.” But this disclosure falls importantly short. For one, Enhanced Features transmits not just “sites” but specific full URLs, including directories, filenames, URL parameters, and search keywords. Indeed, Google’s bubble statement is internally-inconsistent – indicating transmissions of “sites” and “URLs” as if those are the same thing, when in fact the latter is far more intrusive than the former, and the latter is accurate.

The bubble also falls short in its presentation of Google Toolbar’s Privacy Policy. If a user clicks the Privacy Policy hyperlink, the user receives the image shown in the left image below. Notice that the Privacy Policy loads in an unusual window with no browser chrome – no Edit-Find option to let a user search for words of particular interest, no Edit-Select All and Edit-Copy option to let a user copy text to another program for further review, no Save or Print options to let a user preserve the file. Had Google used a standard browser window, all these features would have been available, but by designing this nonstandard window, Google creates all these limitations. The substance of the document is also inapt. For one, “Enhanced Toolbar Features” receive no mention whatsoever until the fifth on-screen page (right image below). Even there, the first bullet describes transmission of “the addresses [of] the sites” users visit – again falsely indicating transmission of mere domain names, not full URLs. The second bullet mentions transmission of “URL[s]” but says such transmission occurs “[w]hen you use Sidewiki to write, edit, or rate an entry.” Taken together, these two bullets falsely indicate that URLs are transmitted only when users interact with Sidewiki, and that only sites are transmitted otherwise, when in fact URLs are transmitted whenever Enhanced Features are turned on.


Clicking the ‘Privacy Policy’ link yields this display. Note the lack of browser chrome — no options to search text, copy to the clipboard, save, or print. Note also the absence of any on-screen mention of the special privacy concerns presented by Enhanced Features.

The first discussion of Enhanced Features appears five pages down. Furthermore, the text falsely indicates that ordinary transmission covers mere domain names, not full URLs. The text says "URL[s]" are transmitted "when you use Sidewiki" and indicates that URLs are not transmitted otherwise.

Certain bundled installations make it even easier for users to get Google Toolbar unrequested, and to end up with Enhanced Features too. With dozens of Google Toolbar partners, using varying installation tactics, a full review of their practices is beyond the scope of this article. But they provide significant additional cause for concern.

Enhanced Features: Easy to Enable, Hard to Turn Off

Google Toolbar's Options screen shows no obvious way to disable Enhanced Features.

The preceding sections show that users can enable Google Toolbar's Enhanced Features with a single click on a prominent, oversized button. But disabling Enhanced Features is much harder.

Consider a user who changes her mind about Enhanced Features but wishes to keep Google Toolbar in its basic mode. How exactly can she do so? Browsing the Google Toolbar’s entire Options screen, I found no option to disable Enhanced Features. Indeed, Enhanced Features are easily enabled with a single click, during installation (as shown above) or thereafter. But disabling Enhanced Features seems to require uninstalling Google Toolbar altogether, and in any event disabling Enhanced Features certainly lacks any comparably-quick command.

I'm reminded of The Eagles' Hotel California: "you can check out any time you like, but you can never leave." And of course, as discussed above, a user who chooses the X button or Manage Add-Ons will naturally believe the Google Toolbar is disabled, when in fact it continues transmissions unabated.

Google Toolbar Disclosures Have Worsened Over Time

Google Toolbar’s historic installer provided superior notice

I first wrote about Google Toolbar’s installation and privacy disclosures in my March 2004 FTC comments on spyware and adware. In that document, I praised Google’s then-current toolbar installation sequence, which featured the impressive screen shown at right.

I praised this screen with the following discussion:

I consider this disclosure particularly laudable because it features the following characteristics:

  • It discusses privacy concerns on a screen dedicated to this topic, separate from unrelated information and separate from information that may be of lesser concern to users.
  • It uses color and layout to signal the importance of the information presented.
  • It uses plain language, simple sentences, and brief paragraphs.
  • It offers the user an opportunity to opt out of the transmission of sensitive information, without losing any more functionality than necessary (given design constraints), and without suffering penalties of any kind (e.g. forfeiture of use of some unrelated software).

As a result of these characteristics, users viewing this screen have the opportunity to make a meaningful, informed choice as to whether or not to enable the Enhanced Features of the Google Toolbar.

I stand by that praise. But six years later, Google Toolbar’s installation sequence is inferior in every way:

  • Now, initial Enhanced Features privacy disclosures appear not in their own screen, but in a bubble pitching another feature (Sidewiki). Previously, format (all-caps, top-of-page), color (red) and language ("… not the usual yada yada") alerted users to the seriousness of the decision at hand.
  • Now, Google presents Enhanced Features as a default with an oversized button, bold type, and acceptance via a single keystroke. Previously, neither option was a default, and both options were presented with equal prominence.
  • Now, privacy statements are imprecise and internally-inconsistent, muddling the concepts of site and URL. Previous disclosures were clear in explaining that acceptance entails "sending us [Google] the URL" of each page a user visits.
  • The current feature name, "Enhanced Features," is less forthright than the prior "Advanced Features" label. The name "Advanced Features" appropriately indicated that the feature is not appropriate for all users (but is intended for, e.g., "advanced" users). In contrast, the current "Enhanced Features" name suggests that the feature is an "enhancement" suitable for everyone.

Google’s Undisclosed Taskbar Button

This 'Google' button was added to my Taskbar without any notice or consent whatsoever — highly unusual for a toolbar or any other software download.

The Google Toolbar also added a “Google” button to my Taskbar, immediately adjacent to the Start button. The Toolbar installer added this button without any disclosure whatsoever in the installation sequence – not on the toolbar.google.com web page, not in the installer EXE, not anywhere else.

An in-Taskbar button is not consistent with ordinary functions users reasonably expect when they seek and install a “toolbar.” Because this function arrives on a user’s computer without notice and without consent, it is an improper intrusion.

What Google Should Do

Google’s first step is simple: Fix the Toolbar so that X and Manage Add-Ons in fact do what they promise. When a user disables Google Toolbar, all Enhanced Features transmissions need to stop, immediately and without exception. This change must be deployed to all Google Toolbar users straightaway.

Google also needs to clean up the results of its nonconsensual data collection. In particular, Google has collected browsing data from users who specifically declined to allow such data to be collected. In some instances this data may be private, sensitive, or embarrassing: Savvy users would naturally choose to disable Google Toolbar before their most sensitive searches. Google ordinarily doesn’t let users delete their records as stored on Google servers. But these records never should have been sent to Google in the first place. So Google should find a way to let concerned users request that Google fully and irreversibly delete their entire Toolbar histories.

Even when Google fixes these nonconsensual transmissions, larger problems remain. The current Toolbar installation sequence suffers inconsistent statements of privacy consequences, with poor presentation of the full Toolbar Privacy Statement. Toolbar adds a button to users’ Taskbar unrequested. And as my videos show, once Google puts its code on a user’s computer, there’s nothing to stop Google from tracking users even after users specifically decline. I’ve run Google Toolbar for nearly a decade, but this week I uninstalled Google Toolbar from all my PCs. I encourage others to do the same.

Upromise Savings — At What Cost? (updated January 25, 2010)

Upromise touts opportunities for college savings. When members shop at participating online merchants, dine at participating restaurants, or purchase selected products at retail stores, Upromise collects commissions which fund college savings accounts.

Unfortunately, the Upromise Toolbar also tracks users' behavior in excruciating detail. In my testing, when I checked an innocuously-labeled box promising "Personalized Offers," the Upromise Toolbar tracked and transmitted my every page-view, every search, and every click, along with many entries into web forms. Remarkably, these transmissions included full credit card numbers — grabbed out of merchants' HTTPS (SSL) secure communications, yet transmitted by Upromise in plain text, readable by anyone using a network monitor or other recording system.

Proof of the Specific Transmissions

I began by running a search at Google. The Upromise toolbar transmissions reported the full URL I requested, including my search provider (yellow) and my full search keywords (green).

POST /fast-cgi/ClickServer HTTP/1.0
User-Agent: upromise/3195/3195/UP23806818/0012
Host: dcs.consumerinput.com
Content-Length: 274
Connection: Keep-Alive

md5=ee593c14f70c1b7f8b3341a91c3e3639&ts=1264045792.140&bua=N%2FA&meth=get&eid=300&fid=NULL&bin=24af1f0
&refererHeader=http%3A%2F%2Fwww%2Egoogle%2Ecom%2F&url=http%3A%2F%2Fwww%2Egoogle%2Ecom%2Fsearch%3Fhl%3D
en%26source%3Dhp%26q%3Di%2Bfeel%2Bsick%26aq%3Df%26aql%26aqi%3Dg10%26oq

HTTP/1.1 200 OK
Date: Thu, 21 Jan 2010 03:49:51 GMT
Server: Apache/2.2.3 (Debian) mod_python/3.3.1 Python/2.5.1 mod_ssl/2.2.3 OpenSSL/0.9.8c
Connection: close
Content-Type: text/html; charset=UTF-8

<capture>
<md5>ee593c14f70c1b7f8b3341a91c3e3639</md5>
<version>1.0</version>
</capture>

I clicked a result — a page on Wikipedia. Transmissions included the full URL of my request (blue) as well as the web search provider (yellow) and keywords (green) that had referred me (red) to this site.

POST /fast-cgi/ClickServer HTTP/1.0
User-Agent: upromise/3195/3195/UP23806818/0012
Host: dcs.consumerinput.com
Content-Length: 304
Connection: Keep-Alive

md5=ee7e3174db149d0d97f51b10db9ac58d&ts=1264045931.921&bua=N%2FA&meth=get&eid=300&fid=NULL&bin=24af1f0
&refererHeader=http%3A%2F%2Fwww%2Egoogle%2Ecom%2Fsearch%3Fhl%3Den%26source%3Dhp%26q%3Di%2Bfeel%2Bsick%
26aq%3Df%26aql%3D%26aqi%3Dg10%26oq%3D&url=http%3A%2F%2Fen%2Ewikipedia%2Eorg%2Fwiki%2FI%5FFeel%5FSick

HTTP/1.1 200 OK
Date: Thu, 21 Jan 2010 03:52:11 GMT
Server: Apache/2.2.3 (Debian) mod_python/3.3.1 Python/2.5.1 mod_ssl/2.2.3 OpenSSL/0.9.8c
Connection: close
Content-Type: text/html; charset=UTF-8

<capture>
<md5>ee7e3174db149d0d97f51b10db9ac58d</md5>
<version>1.0</version>
</capture>

I browsed onwards to Buy.com (grey), where I added an item to my shopping cart and proceeded to checkout. When prompted, I entered a (made-up) credit card number. Buy.com appropriately secured the card number with HTTPS encryption. But, remarkably, Upromise extracted and transmitted the full sixteen-digit card number (yellow) — as well as my (also fictitious) CVV code (green), and expiration date (blue).

POST /fast-cgi/ClickServer HTTP/1.0
User-Agent: upromise/3195/3195/UP23806818/0012
Host: dcs.consumerinput.com
Content-Length: 1936
Connection: Keep-Alive

md5=352732ac9ee0d9e970f3c65d62ed03b1&ts=1264046115.702&bua=N%2FA&meth=post&eid=300&fid=NULL&bin=24af1f0
&refererHeader=https%3A%2F%2Fssl%2Ebuy%2Ecom%2FCO%2FCheckout%2FpaymentOptions%2Easpx&url=https%3A%2F%2F
ssl%2Ebuy%2Ecom%2FCO%2FCheckout%2FpaymentOptions%2Easpx%3F%5F%5FEVENTTARGET%26%5F%5FEVENTARGUMENT%26%5F
%5FVIEWSTATE%3D%2FwEPDwUKMTQ1MjY3MzM2Mw9kFgQCAw9kFgQCAQ8PFgIeCEltYWdlVXJsBV1odHRwczovL2EyNDguZS5ha2FtYW
kubmV0L2YvMjQ4Lzg0NS8xMGgvaW1hZ2VzLmJ1eS5jb20vYnV5X2Fzc2V0cy92Ni9oZWFkZXIvMjAwNi9idXlfbG9nby5naWZkZAICD
w8WAh8ABXdodHRwczovL2EyNDguZS5ha2FtYWkubmV0L2YvMjQ4Lzg0NS8xMGgvaW1hZ2VzLmJ1eS5jb20vYnV5X2Fzc2V0cy92Ni9j
b3JwL2NoZWNrb3V0X3Byb2Nlc3MvY2hlY2tvdXRfM19wYXltZW50X2dyZWVuLmdpZmRkAgUPZBYQAgUPZBYCZg9kFgRmDw8WBB8ABVN
odHRwczovL2EyNDguZS5ha2FtYWkubmV0L2YvMjQ4Lzg0NS8xMGgvaW1hZ2VzLmJ1eS5jb20vYnV5X2Fzc2V0cy9jcy9pbWFnZXMvYm
1sLmdpZh4HVmlzaWJsZWdkZAIBDw8WAh8ABWNodHRwczovL2EyNDguZS5ha2FtYWkubmV0L2YvMjQ4Lzg0NS8xMGgvaW1hZ2VzLmJ1e
S5jb20vYnV5X2Fzc2V0cy9idXR0b25zLzIwMDgvYnV0dG9uX2NvbnRpbnVlMi5naWZkZAIHD2QWAmYPZBYKAgUPEA9kFgIeB29uY2xp
Y2sFKFNldFVuaXF1ZVJhZGlvQnV0dG9uKCdyZG9QYXltZW50cycsdGhpcylkZGQCBw8QD2QWAh4Ib25DaGFuZ2UFG2phdmFzY3JpcHQ
6c2hvd0hpZGVDQyh0aGlzKWRkZAIJDw9kFgIfAgUfc3dpdGNoUmFkaW8oJ3Jkb05ld0NyZWRpdENhcmQnKWQCDQ8QZA8WDWYCAQICAg
MCBAIFAgYCBwIIAgkCCgILAgwWDRAFBDIwMTAFBDIwMTBnEAUEMjAxMQUEMjAxMWcQBQQyMDEyBQQyMDEyZxAFBD%2A%2A%2A%2A%7B
1422%7D%26%5F%5FEVENTVALIDATION%3D%2FwEWKQLNvonpCgKYm5r9BALB5eOPBALy2ZqvCQLv2ZqvCQLq2ZqvCQLu2ZqvCQLr2ba
vCQL28Nu2CwLDqZ7xDALDqZrxDALDqabxDALDqaLxDALDqa7xDALDqarxDALDqbbxDALDqfLyDALDqf7yDALcqZLxDALcqZ7xDALcqZ
rxDALrtZj9AgLrtaSgCQLrtbAHAuu13OoIAuu16NEPAuu19LQGAuu1gJgNAuu1rP8FAuu1%2BJcHAuu1hPsPAvai%2BtMMAvaihrcDA
vaikpoKAt7CxckKAsqi76gMAva5sdkEAvLqvYoEAoaI4c0FArr3pYIHApibrtgNijCSRDzGArpVaF4m3bbOLp8yvUM%3D%26rdoPaym
ents%3DrdoNewCreditCard%26lstCCType%3D9%26txtNewCCNumber%3D4412124112341234%26lstNewCCMonth%3D01%26lstN
ewCCYear%3D2010%26txtNewCVV2%3D222%26btnContinue2%2Ex%3D18%26btnContinue2%2Ey%3D19

HTTP/1.1 200 OK
Date: Thu, 21 Jan 2010 03:55:15 GMT
Server: Apache/2.2.3 (Debian) mod_python/3.3.1 Python/2.5.1 mod_ssl/2.2.3 OpenSSL/0.9.8c
Connection: close
Content-Type: text/html; charset=UTF-8

<capture>
<md5>352732ac9ee0d9e970f3c65d62ed03b1</md5>
<version>1.0</version>
</capture>
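
To see how readily a passive observer could recover the card details from the cleartext capture above, consider a minimal sketch in Python. The body string is a truncated, illustrative copy of the POST body, keeping only the relevant fields; the parameter names (txtNewCCNumber, txtNewCVV2, lstNewCCYear) are taken directly from the capture.

from urllib.parse import parse_qs

# Truncated, illustrative copy of the cleartext POST body shown above.
body = ("md5=352732ac9ee0d9e970f3c65d62ed03b1&meth=post"
        "&url=https%3A%2F%2Fssl%2Ebuy%2Ecom%2FCO%2FCheckout%2FpaymentOptions%2Easpx"
        "%3FtxtNewCCNumber%3D4412124112341234%26txtNewCVV2%3D222%26lstNewCCYear%3D2010")

checkout_url = parse_qs(body)["url"][0]         # the Buy.com checkout URL, decoded
form = parse_qs(checkout_url.split("?", 1)[1])  # the form fields Upromise captured
print(form["txtNewCCNumber"][0])                # 4412124112341234
print(form["txtNewCVV2"][0])                    # 222
print(form["lstNewCCYear"][0])                  # 2010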

Upromise also transmitted my email address. For example, when I logged into Restaurant.com, Upromise’s transmission included my email (yellow):

POST /fast-cgi/ClickServer HTTP/1.0
User-Agent: upromise/3195/3195/UP23806818/0012
Host: dcs.consumerinput.com
Content-Length: 378
Connection: Keep-Alive

md5=26830cfab132bb3122fcf40d8cd0f2f9&ts=1264049936.327&bua=N%2FA&meth=post&eid=303&fid=NULL&bin=24af1f0
&refererHeader=&url=https%3A%2F%2Fwww%2Erestaurant%2Ecom%2Fregister%2Dlogin%2Easp%3Fpurchasestatus%3DRE
GISTER%2DLOGIN%26hdnButton%3DSignin%26txtMemberSignin%3Dedelman%40pobox%2Ecom%26radio%5Fcust%3Dcustomer
%5Fexisting%26txtMemberPassword…

HTTP/1.1 200 OK
Date: Thu, 21 Jan 2010 04:58:56 GMT
Server: Apache/2.2.3 (Debian) mod_python/3.3.1 Python/2.5.1 mod_ssl/2.2.3 OpenSSL/0.9.8c
Connection: close
Content-Type: text/html; charset=UTF-8

<capture>
<md5>26830cfab132bb3122fcf40d8cd0f2f9</md5>
<version>1.0</version>
</capture>

All the preceding transmissions were made over my ordinary Internet connection just as shown above. In particular, these transmissions were sent in plain text — without encryption or encoding of any kind. Any computer with a network monitor (including, for users connected by Wi-Fi, any nearby wireless user) could easily read these communications. With no additional hardware or software, a nearby listener could thereby obtain, e.g., users' full credit card numbers — even though merchants used HTTPS security to attempt to keep those numbers confidential.

The Destination of Upromise’s Transmissions: Compete, Inc.

As shown in the "host:" header of each of the preceding communications, transmissions flow to the consumerinput.com domain. Whois reports that this domain is registered to Boston, MA traffic-monitoring service Compete, Inc. Compete’s site promises clients access to "detailed behavioral data," and Compete says more than 2 million U.S. Internet users "have given [Compete] permission to analyze the web pages they visit."

Upromise’s Disclosures Misrepresent Data Collection and Fail to Obtain Consumer Consent

Upromise’s installation sequence does not obtain users’ permission for this detailed and intrusive tracking. Quite the contrary: Numerous Upromise screens discuss privacy, and they all fail to mention the detailed information Upromise actually transmits.

The Upromise toolbar installation page touts the toolbar’s purported benefits at length, but mentions no privacy implications whatsoever.

If a user clicks the prominent button to begin the toolbar installation, the next screen presents a 1,354-word license agreement that fills 22 on-screen pages and offers no mechanism to enlarge, maximize, print, save, or search the lengthy text. But even if a user did read the license, the user would receive no notice of detailed tracking. Meanwhile, the lower on-screen box describes a "Personalized Offers" feature, which is labeled as causing "information about [a user's] online activity [to be] collected and used to provide college savings opportunities." But that screen nowhere admits collecting users' email addresses or credit card numbers. Nor would a user rightly expect that "information about … online activity" means a full log of every search and every page-view across the entire web.

The install sequence does link to Upromise’s privacy policy. But this page also fails to admit the detailed tracking Upromise performs. Indeed, the privacy policy promises that Personalized Offers data collection will be "anonymous" — when in fact the transmissions include email addresses and credit card numbers. The privacy policy then claims that any collection of personal information is "inadvertent" and that such information is collected only "potentially." But I found that the information transmissions described above were standard and ongoing.

The privacy policy also limits Upromise’s sharing of information with third parties, claiming that such sharing will include only "non-personally identifiable data." But I found that Upromise was sharing highly sensitive personal information, including email addresses and credit card numbers.

In addition, Upromise's data transmissions contradict representations in Upromise's FAQ. The top entry in the FAQ promises that Upromise "has implemented security systems designed to keep [users'] information safe." But transmitting credit card numbers in cleartext is reckless and ill-advised — specifically contrary to standard Payment Card Industry (PCI) rules, and the opposite of "safe." Indeed, in an ironic twist, Upromise's FAQ goes on to discuss at great length the benefits of SSL encryption — benefits of course lacking when Upromise transmits users' credit card numbers without such encryption.

The Upromise toolbar offers an Options screen which again makes no mention of key data transmissions. The screen admits collecting "information about the web sites you visit." But it says nothing of collecting email addresses, credit card numbers, and more.

The Scope and Solution

The transmissions at issue affect only those users who agree to run Upromise’s Personalized Offers system. But until last week, that option was set as the default upon new Upromise toolbar installations — inviting users to initiate these transmissions without a second look.

Even if Upromise ceased the most outrageous transmissions detailed above, Upromise’s installation disclosures would still give ample basis for concern. To describe tracking, transmitting, and analyzing a user’s every page-view and every search, Upromise’s install screen euphemistically mentions that its "service provider may use non-personally identifiable information about your online activity." This admission appears below a lengthy EULA, under a heading irrelevantly labeled "Personalized Offers" — doing little to draw users’ attention to the serious implications of Personalized Offers. I don’t see why Upromise users would ever want to accept this detailed tracking: It’s entirely unrelated to the college savings that draws users to the Upromise site. But if Upromise is to keep this function, users deserve a clear, precise, and well-labeled statement of exactly what they’re getting into.

Upromise’s multi-faceted business raises other concerns that also deserve a critical look. The implications for affiliate merchants are particularly serious — a risk of paying affiliate commission for users already at a merchant’s site. But I’ll save this subject for another day.

Upromise’s Response (posted: January 23, 2010)

Upromise told PC Magazine’s Larry Seltzer that less than 2% of its 12 million members were affected. But 2% of 12 million is 240,000 — a lot of affected users!

Upromise staff also wrote to me. I replied with a few questions:

1) How long was the problem in effect?

2) How many users were affected?

3) What caused the problem?

4) I notice that the Upromise toolbar EULA has numerous firmly-worded defensive provisions. (See the entire sections captioned “Disclaimer of Warranty” and “Limitation of Liability” – purporting to disclaim responsibility for a wide variety of problems. And then see Governing Law and Arbitration for a purported limitation on what law governs and how claims may be brought.) For consumers who believe they suffered losses or harms as a result of this problem, will Upromise invoke the defensive provisions in its EULA to attempt to avoid or limit liability?

I’m particularly interested in the answer to the final question. The EULA is awfully one-sided. I appreciate Upromise confirming the serious failure I uncovered. Upromise should also accept whatever legal liability goes with this breach.

Upromise’s Response to My Questions (posted: January 25, 2010)

Upromise replied to my four questions, indicating that the scope of the problem was "approximately 1% of our 12 million members." Upromise did not answer my questions about the duration of the problem, the cause of the problem, or the effect of Upromise’s EULA defenses.

Google Click Fraud Inflates Conversion Rates and Tricks Advertisers into Overpaying

I’ve repeatedly reported improper placements of Google ads. In most of my write-ups, the impropriety occurs in ad placement — Google PPC ads shown in spyware popups (1, 2, 3, 4), in typosquatting sites (1, 2), or in improperly-installed and/or deceptive toolbars (1, 2). This article is different: Here, the impropriety includes a fake click — click fraud — charging an advertiser for a PPC click, when in fact the user never actually clicked.

But this is no ordinary click fraud. Here, spyware on a user’s PC monitors the user’s browsing to determine the user’s likely purchase intent. Then the spyware fakes a click on a Google PPC ad promoting the exact merchant the user was already visiting. If the user proceeds to make a purchase — reasonably likely for a user already intentionally requesting the merchant’s site — the merchant will naturally credit Google for the sale. Furthermore, a standard ad optimization strategy will lead the merchant to increase its Google PPC bid for this keyword on the reasonable (albeit mistaken) view that Google is successfully finding new customers. But in fact Google and its partners are merely taking credit for customers the merchant had already reached by other methods.

In this piece, I show the details of the spyware that tracks user browsing and fakes Google PPC ad clicks, and I identify the numerous intermediaries that perpetrate these improper charges. I then criticize Google’s decision to continue placing ads through InfoSpace, the traffic broker that connected Google to this click fraud chain. I consider this practice in light of Google’s advice to advertisers and Google’s favored arguments that click fraud problems are small and manageable. Finally, I propose specific actions Google should take to prevent these scams and to satisfy Google’s obligations to advertisers.

Introducing the Problem: A Reader’s Analogy

Reading a prior article on my site, a Register discussion forum participant offered a useful analogy:

Let’s say a restaurant decides [it] wants someone to hand out fliers … so they offer this guy $0.10 a flier to print some and distribute them.

The guy they hire just stands at the front door and hand the fliers to anyone already walking through the door.

Restaurant pays lots of money and gains zero customers.

Guy handing out the fliers tells the owner how many fliers were printed and compares that to how many people bring the fliers into his restaurant.

The owner thinks the fliers are very successful and now offers $0.20 for each one.

It’s easy to see how the restaurant owner could be tricked. Such scams are especially easy in online advertising — where distance, undisclosed partnerships, and general opacity make it far harder for advertisers to figure out where and how Google and its partners present advertisers’ offers.

Google and Its Partners Covering Advertisers’ Sites with Spyware-Delivered Click-Fraud Popups

The money trail – how funds flow from advertisers to Google to Trafficsolar spyware (money flows down the chain; viewers flow back up it):

PPC advertisers (e.g. Finish Line) → Google → InfoSpace → Cheapstuff → Adfirmative → dSide Marketing → Netaxle → eWoss → AdOn Network → Trafficsolar

In testing of December 31, 2009, my Automatic Spyware Advertising Tester browsed Finishline.com, a popular online shoe store, on a virtual computer infected with Trafficsolar spyware (among other advertising software, all installed through security exploits without user consent). Trafficsolar opened a full-screen unlabeled popup, which ultimately redirected back to Finish Line via a fake Google PPC click (i.e., click fraud).

My AutoTester preserved screenshots, video, and packet log of this occurrence. The full sequence of redirects:

1. Trafficsolar opens a full-screen popup window loading from urtbk.com, a redirect server for AdOn Network. (AdOn, of Tempe, Arizona, first caught my eye when it boasted of relationships with 180solutions/Zango and Direct Revenue. NYAG documents later revealed that AdOn distributed more than 130,000 copies of Direct Revenue spyware. More recently, I’ve repeatedly reported AdOn facilitating affiliate fraud, inflating sites’ traffic stats, and showing unrequested sexually-explicit images.)

2. AdOn redirects to eWoss. (eWoss, of Overland Park, Kansas, has appeared in scores of spyware popups recorded by my testing systems.)

3. eWoss redirects to Netaxle. (NetAxle, of Prairie Village, Kansas, has also appeared in numerous popups — typically, as here, brokering traffic from eWoss.)

4. Netaxle redirects to dSide Marketing. (dSide Marketing, of Montreal, Canada, says it provides full-service SEO and SEM services.)

5. dSide Marketing redirects to Adfirmative. (Adfirmative, of Austin, Texas, promises “click-fraud protected, targeted advertising” and “advanced click-fraud prevention.”)

6. Adfirmative redirects to Cheapstuff. (Cheapstuff fails to provide an address on its web site or in Whois, though its posted phone number is in Santa Monica, California. Cheapstuff’s web site shows a variety of commercial offers with a large number of advertisements.)

7. Cheapstuff redirects to InfoSpace. (InfoSpace, of Bellevue, Washington, is discussed further in the next section.)

8. InfoSpace redirects to Google, which redirects through DoubleClick and onwards back to Finish Line — the same site my tester had been browsing in the first place.
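
Server-side hops in a chain like the one above can often be enumerated automatically. Below is a minimal sketch (my own illustration, with a hypothetical starting URL, not a URL from this incident): it follows ordinary HTTP 3xx redirects and prints each hop. Note that spyware-delivered chains frequently mix in JavaScript and meta-refresh redirects, which a bare HTTP client will not follow — those appear only in a full browser session and packet log, as in my AutoTester’s records.

```python
# Minimal sketch: enumerate an HTTP redirect chain, assuming ordinary 3xx redirects.
# The starting URL below is hypothetical -- substitute a URL captured from a packet log.
import requests

def trace_redirects(start_url: str) -> list[str]:
    """Follow HTTP redirects from start_url and return every hop in order."""
    response = requests.get(start_url, allow_redirects=True, timeout=15)
    hops = [r.url for r in response.history]  # one entry per 3xx response along the way
    hops.append(response.url)                 # the final landing page
    return hops

if __name__ == "__main__":
    for step, url in enumerate(trace_redirects("http://example.com/entry-link"), start=1):
        print(f"{step}. {url}")
```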

This placement is a bad deal for Finish Line for at least two reasons. First, Google charges Finish Line a fee to access a user already at Finish Line’s site. But that’s more of a shakedown than genuine advertising: an advertiser should not have to pay to reach a user already at its site. Furthermore, Google styles its advertising as “pay per click”, promising advertisers that “You’re charged only if someone clicks your ad.” But here, the video and packet log clearly confirm that the Google click link was invoked without the user even seeing a Google ad link, let alone clicking it. Advertisers paying high Google prices deserve high-quality ad placements, not spyware popups and click fraud.

Finally, the popup lacks the labeling specifically required by FTC precedent. Consistent with the FTC’s settlements in its Direct Revenue and Zango cases, every spyware/adware popup must be labeled with the name of the program that caused the popup, along with uninstall instructions. Furthermore, the FTC has taken an appropriately dim view of advertising software installed on users’ computers without user consent — yet every single Trafficsolar installation I’ve ever seen has arrived on my test computers through security exploits, without consent. For these reasons, this Trafficsolar-Google popup clearly falls afoul of applicable FTC requirements.

Critiquing InfoSpace’s Role

As shown in the prior section and diagram, traffic flows through a remarkable seven intermediaries en route from Trafficsolar spyware to the victim Google advertiser. Looking at such a lengthy chain, the problem may seem intractable: How could Google effectively supervise a partner’s partner’s partner’s partner’s partner’s partner’s partner’s partner? That insurmountable challenge is exactly why Google should never have gone down this path. Instead, Google should place ads only through the companies with which Google has direct relationships.

In this instance, when traffic finally gets to Google, it comes through a predictable source: InfoSpace. It was InfoSpace, and InfoSpace alone, that distributed Google ads into the morass of subsyndicators and redistributors detailed above.

Flipping through my records of prior InfoSpace observations, I was struck by the half-decade of bad behavior. Consider:

June 2005: I showed InfoSpace placing Google ads into the IBIS Toolbar which, I demonstrated in multiple screen-capture videos, was arriving on users’ computers through security exploits (without user consent). The packet log revealed that traffic flowed from IBIS directly to InfoSpace’s Go2net.com — suggesting that InfoSpace had a direct relationship with IBIS and paid IBIS directly, not via any intermediary.

August 2005: I showed InfoSpace placing ads through notorious spyware vendor Direct Revenue (covering advertisers’ sites with unlabeled popups presenting their own PPC ads). The packet log revealed that traffic flowed from Direct Revenue directly to InfoSpace — suggesting that InfoSpace had a direct relationship with Direct Revenue and paid Direct Revenue directly, not via any intermediary.

August 2005: I showed InfoSpace placing ads through notorious spyware vendor 180solutions/Zango. The packet log revealed that traffic flowed from 180solutions directly to InfoSpace — suggesting that InfoSpace had a direct relationship with 180solutions and paid 180solutions directly, not via any intermediary.

February 2009: I showed InfoSpace placing Google ads into WhenU popups that covered advertisers’ sites with their own PPC ads.

May 2009: Again, I showed InfoSpace using WhenU to cover advertisers’ sites with their own PPC ads, through partners nearly identical to the February report.

January 2010 (last week): I showed InfoSpace still placing Google ads into WhenU popups and still covering advertisers’ sites with their own PPC ads.

And those are just placements I happened to write up on my public site! Combine this pattern of behavior with InfoSpace’s well-documented accounting fraud, and InfoSpace hardly appears a sensible partner for Google and the advertisers who entrust Google to manage their spending.

Nor can InfoSpace defend this placement by claiming Cheapstuff looked like a suitable place to show ads. The Cheapstuff site features no mailing address or indication of the location of corporate headquarters. WHOIS lists a “privacy protection” service in lieu of a street address or genuine email address. These omissions are highly unusual for a legitimate advertising broker. They should have put InfoSpace and Google on notice that Cheapstuff was up to no good.

This Click Fraud Undercuts Google’s Favorite Defense to Click Fraud Complaints

When an advertiser buys a pay-per-click ad and subsequently makes a sale, it’s natural to assume that sale resulted primarily from the PPC vendor’s efforts on the advertiser’s behalf. But the click fraud detailed in this article takes advantage of this assumption by faking clicks to target purchases that would have happened anyway. Then, when advertisers evaluate the PPC traffic they bought, they overvalue this “conversion inflation” traffic — leading advertisers to overbid and overpay.

Indeed, advertisers following Google’s own instructions will fall into the overbidding trap. Discussing “traffic quality” (i.e. click fraud and similar schemes), Google tells advertisers to “track campaign performance” for “ROI monitoring.” That is, when an advertiser sees a Google ad click followed by a sale, the advertiser is supposed to conclude that ads are working well and delivering value, and that click fraud is not a problem. Google’s detailed “Click Fraud: Anecdotes from the Front Line” takes a similar approach, advising that “ROI is king” — again assuming that clicks that precede purchases must be valuable clicks.

Google’s advice reflects an overly optimistic view of click fraud. Google assumes click fraudsters will send random, untargeted traffic. But click fraudsters can monitor user activities to identify a user’s likely future purchases, just as Trafficsolar does in this example. Such a fraudster can fake the right PPC clicks to get credit for traffic that appears to be legitimate and valuable — even though in fact the traffic is just as worthless as other click fraud.
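
A toy simulation makes the point concrete. The figures below are hypothetical — my own illustration, not any advertiser’s actual data — but they show why ROI monitoring alone cannot flag this scheme: clicks faked against users already headed to the merchant convert at the merchant’s ordinary rate, so measured returns look excellent even though the clicks cause no incremental sales.

```python
# Toy illustration with hypothetical numbers: why "ROI is king" misses targeted click fraud.
# A fraudster fakes PPC clicks only for users who were already visiting the merchant's site,
# so the faked clicks "convert" at the organic rate -- yet cause zero incremental sales.
import random

random.seed(1)

CPC = 1.00           # hypothetical cost charged per faked click, in dollars
ORDER_VALUE = 80.00  # hypothetical average order value
ORGANIC_CONV = 0.05  # hypothetical conversion rate of users already at the merchant's site

def simulate(faked_clicks: int) -> None:
    # each faked click targets a user already at the site, who buys at the organic rate
    conversions = sum(random.random() < ORGANIC_CONV for _ in range(faked_clicks))
    measured_return = (conversions * ORDER_VALUE) / (faked_clicks * CPC)
    print(f"faked clicks charged to the advertiser: {faked_clicks}")
    print(f"sales the advertiser attributes to those clicks: {conversions}")
    print(f"measured return on ad spend: {measured_return:.1f}x  (looks like a great campaign)")
    print("sales actually caused by the faked clicks: 0")

simulate(10_000)
```

Seen through the advertiser’s own “ROI monitoring,” this traffic looks like a reason to raise bids — which is exactly the overbidding trap described above.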

What Google Should Do

Google’s best first step remains as in my posting last week: Fire InfoSpace. Google doesn’t need InfoSpace: high-quality partners know to approach Google directly, and Google certainly does not need InfoSpace adding further subpartners of its own.

Google also needs to pay restitution to affected advertisers. Every time Google charges an advertiser for a click that comes from InfoSpace, Google relies on InfoSpace’s promise that the click was legitimate, genuine, and lawfully obtained. But there is ample reason to doubt these promises. Google should refund advertisers for corresponding charges — for all InfoSpace traffic if Google cannot reliably determine which InfoSpace traffic is legitimate. These refunds should apply immediately and across-the-board — not just to advertisers who know how to complain or who manage to assemble exceptional documentation of the infraction.

More generally, Google must live up to the responsibility of spending other people’s money. Through its Search Network, Google takes control of advertisers’ budgets and decides, unilaterally, where to place advertisers’ ads. (Indeed, for Search Network purchases, Google to this day fails to tell advertisers what sites show their ads. Nor does Google allow opt-outs on a site-by-site basis — policies that also ought to change.) Spending others’ money, wisely and responsibly, is a weighty undertaking. Google should approach this task with significantly greater diligence and care than current partnerships indicate. Amending its AdWords Terms and Conditions is a necessary step in this process: Not only should Google do better, but contracts should confirm Google’s obligation to offer refunds when Google falls short.

I’m disappointed by Google’s repeated refusal to take the necessary precautions to prevent these scams. InfoSpace’s shortcomings are well-known, longstanding, and abundantly documented. What will it take to get Google to eject InfoSpace and protect its advertisers’ budgets?

Google Still Charging Advertisers for Conversion-Inflation Traffic from WhenU Spyware updated January 7, 2010

When an advertiser buys a pay-per-click ad and subsequently makes a sale, it’s natural to assume that sale resulted primarily from the PPC vendor’s efforts on the advertiser’s behalf. But tricky PPC platforms take advantage of this assumption by referring purchases that would have happened anyway. Then, when advertisers evaluate the PPC traffic they bought, they overvalue this “conversion inflation” traffic — leading advertisers to overbid and overpay.

In this piece, I show Google and its partners still covering popular sites with PPC advertisements promoting those same sites. I present the role of InfoSpace, the Google partner at the core of these misplacements, and I argue that Google should long ago have severed its ties to InfoSpace. I cite specific Google promises that these placements violate, and I critique Google’s contractual disclaimers that claim advertisers must pay for these bogus placements. Finally, I propose specific actions Google should take to satisfy its obligations to advertisers.

Google and Its Partners Still Covering Advertisers’ Sites with Spyware-Delivered Popups

WhenU covers Continental with its own Google ads — charging ad fees for traffic Continental would otherwise receive for free

As shown in the thumbnail at right and detailed in screenshots, video, and packet log, WhenU continues to cover web sites with PPC popups. Crucially, those popups show Google ads — often promoting the very same sites users are already browsing.

In the example shown at right, I browsed the Continental Airlines site. WhenU opened the popup shown at right — covering the Continental site with a list of Google ads, putting a prominent Continental ad front-and-center. Thus, Google charges Continental a fee to access a user already at Continental’s site. That’s a rotten deal for Continental: For one, an advertiser should not have to pay to reach a user already at its site. Furthermore, advertisers paying high Google prices deserve high-quality ad placements, not spyware popups.

The details of the Continental ad, as shown in the WhenU-Google popup, further entice users to click. The ad promises a “low fare guarantee” — suggesting that users who book some other way (without clicking the ad) may not enjoy that guarantee. And the ad promises to take users to the “official site” — suggesting that users who don’t click the ad will book through a site that is less than official. In fact both suggestions are inaccurate, but a reasonable user would naturally reach these conclusions based on the wording of the advertisement and the context of its appearance.

The WhenU-Google popup lacks the labeling specifically required by FTC policy. In particular, all sponsored search ads are to be labeled as such, pursuant to the FTC’s 2002 instructions. But look closely at the popup screenshot. On my ordinary 800×600 screen, no such label appears. I gather the required label would ordinarily appear on a sufficiently large screen, but the FTC’s policies make no exceptions for users with small to midsized screens. Indeed, as netbooks gain popularity, small screens are increasingly common.

The first chain in the diagram below confirms the specific intermediaries passing traffic from WhenU to Google in this instance.

The money trail: how funds flow from advertisers to Google to WhenU — three examples persisting over ten months (money flows down each chain; viewers flow back up it):

December 2009: PPC advertisers (e.g. Continental) → Google → InfoSpace → LocalPages → (unknown company) → WhenU

May 2009: PPC advertisers (e.g. RCN) → Google → InfoSpace → LocalPages → Nbcsearch → LocalPages → WhenU

February 2009: PPC advertisers (e.g. Verizon) → Google → InfoSpace → LocalPages → WhenU

This observation marks the third sequence by which I have observed Google paying WhenU to cover advertisers’ sites with the advertisers’ own Google ads. The second and third chains (above) show the intermediaries in my May 2009 and February 2009 observations of similar placements.

The Impropriety of Google’s Relationship with InfoSpace

In all three instances I reported (as summarized in the diagram above), Google’s closest link is to InfoSpace. That is, Google pays InfoSpace, and InfoSpace pays the various entities that follow. In my view, Google’s relationship with InfoSpace is ill-advised for at least three reasons:

First, InfoSpace has a track record of improper placements of Google ads. InfoSpace is implicated in all three of the placements detailed above — misplacements that have continued over a lengthy period despite ample notice and opportunity for correction. Furthermore, I have personally observed other improper placements by InfoSpace. (Perhaps I’ll post more in a future piece.) Google need not continue to do business with a distributor with such a poor track record.

Second, Google does not need a distributor whose business model entails farming out ad placements to subdistributors. If InfoSpace’s subdistributors seek to distribute Google ads, and to be paid for doing so, let them apply directly to Google and undergo Google’s ordinary quality control and oversight. Inserting InfoSpace as an additional intermediary serves only to lessen accountability.

Third, InfoSpace’s corporate history undermines any request for lenience or forgiveness. The Seattle Times chronicles InfoSpace’s accounting fraud in a three-part investigative report, “Dot-Con Job,” presenting 12,000+ words of analysis as well as primary source documents and even voicemail recordings. The Seattle Times summarizes its findings: “Investors were cashing out millions, and faithful investors were left with pennies.” Hardly a mark of trustworthiness!

These Ads Violate Google’s Promises to Users

These ad placements fall short of Google’s promises to users. By paying spyware vendors to show advertisements, Google both enlarges and prolongs the spyware problem. In particular, Google’s funding supports software that users struggle to remove from their computers. Google’s payments make it more profitable for vendors to sneak such software onto users’ computers in the first place.

Furthermore, Google’s Software Principles specifically disallow WhenU’s practices. Google’s “installation” and “upfront disclosure” principles disallow deceptive and nonconsensual WhenU installations. (I have video proof on file showing nonconsensual WhenU installations.) Google’s prohibition on “snooping” prohibits certain WhenU privacy practices, including WhenU’s historic violation of its own privacy policy (transmitting full page URLs despite a privacy policy promising “As the user surfs the Internet, URLS visited by the user … are NOT transmitted to WhenU.com or any third party server”).

Crucially, Google’s partnership with WhenU directly contradicts Google’s call for software makers and advertising intermediaries to “keep[] good company” by supervising partners. Despite that commitment, present on Google’s site for 4+ years, Google inexplicably continues its relationship with WhenU.

These Ads Violate Google’s Promises to Advertisers

These ad placements also fall short of Google’s obligations to advertisers. For example, when Google describes its Search Network, Google promises:

Ads are targeted based on a user’s search terms.   (emphasis added)

But here, the user performed no search — so there was no proper cause to display a Search Network ad or charge an advertiser a high Search Network price.

Google confirms:

On the Search Network, ads are shown … on … the search results pages of … Google’s search partners … within the Search Network. On our search partners, your ads may appear alongside or above search results, as part of a results page as a user navigates through a site’s directory, or on other relevant search pages.   (emphasis added)

A placement through a spyware popup does not meet these criteria: A spyware popup is not a “page.” Furthermore, a user browsing an ordinary web site (like the Continental site shown above) is neither “search[ing]” nor navigating a “directory,” contrary to Google’s promise that search ads are shown to users at search engines and directories.

Despite these clear promises, Google’s AdWords Terms and Conditions purport to allow these placements and any others Google might choose to foist on unsuspecting advertisers. Google requires advertisers to accept the following form contract provisions:

Customer understands and agrees that ads may be placed on … (z) any other content or property provided by a third party (‘Partner’) upon which Google places ads (‘Partner Property’).   (emphasis added)

That’s circular, uninformative, and a rotten deal. Advertisers should demand better. Nor should Google’s fine print claim the right to impose such bogus charges. Google should amend its contract to disavow charges from spyware, adware, conversion-inflation, and other schemes contrary to Google’s affirmative promises.

What Google Should Do

Google’s first step is easy: Fire InfoSpace. Google doesn’t need InfoSpace, and there’s zero reason for this relationship to continue in light of InfoSpace’s repeated failings.

Google also needs to pay restitution to affected advertisers. Every time Google charges an advertiser for a click that comes from InfoSpace, Google relies on InfoSpace’s promise that the click was legitimate, genuine, and lawfully obtained. But there is ample reason to doubt these promises. Google should refund advertisers for corresponding charges — for all InfoSpace traffic if Google cannot reliably determine which InfoSpace traffic is legitimate. These refunds should apply immediately and across-the-board — not just to advertisers who know how to complain or who manage to assemble exceptional documentation of the infraction. (Indeed, in response to my May 2009 report, I know Google provided a credit to RCN — the specific advertiser whose targeting I happened to feature in my example. But I gather Google failed to provide automatic credits to all affected advertisers, even though Google’s billing records provide ample documentation of which advertisers faced charges from which Google partners. And I understand that Google denied requests for refunds or credits from other affected advertisers.)

More generally, Google must live up to the responsibility of spending other people’s money. Through its Search Network offering, Google takes control of advertisers’ budgets and decides, unilaterally, where to place advertisers’ ads. (Indeed, for Search Network purchases, Google to this day fails to tell advertisers what sites show their ads. Nor does Google allow opt-outs on a site-by-site basis — policies that also ought to change.) Spending others’ money, wisely and responsibly, is a weighty undertaking. Google should approach this task with significantly greater diligence and care than current partnerships indicate. Amending its AdWords T&C’s is a necessary step in this process: Not only should Google do better, but contracts should confirm Google’s obligation to offer refunds when Google falls short.

I’m disappointed by how little has changed since my year-ago reports of these same practices. In a conference presentation in February 2009, I demonstrated substantially similar WhenU placements, with Google’s Rose Hagan (Senior Trademark Counsel) present in the audience. In May 2009 I wrote up these WhenU placements on my web site in great detail. Yet ten months later, the problem continues unabated. Indeed, the other misplacements I identified in May 2009 also continue: Google continues partnering with IAC SmileyCentral (deceptive browser plug-ins that induce searches when users attempt navigations), placing ads on typosquatting sites (including sites that show a company’s own ads when users mistype that company’s domain name), and, through Google Chrome, inviting users to search (and click prominent top-of-page ads) when direct navigation would better satisfy users’ requests and avoid unnecessary advertising costs for advertisers. I’m disappointed by the lack of progress when, in each instance, the improper charges are clear and well-documented. Google’s intransigence confirms the need for the Bill of Rights for Online Advertisers I proposed this fall.

In Support of Utah’s HB450

When a user searches for Hertz, may a search engine show ads for Avis instead?* A natural libertarian instinct might reply yes, sure, do whatever you want. I want to push back on that, offering reasons why such ads are improper.

Modern search engines are notable for their striking ability to give users exactly what they ask for. Search for Hertz, and most of the links will indeed take you to Hertz or bona fide Hertz-related sites (like booking agents or consumer reviews). In this context, what is a user to think when a search engine serves up an ad for something altogether different from a user’s request? Because search engines are generally so good at providing just what users requested, there’s likely user confusion any time a search engine instead replies with links to competitors. After all, if a user asked for Hertz, it’s perfectly reasonable for the user to expect that resulting links will be responsive to the user’s request.

Now, search engines often say their ad labels cure any possible user confusion. I disagree. For one, the labels are easily overlooked — all the way off to the side, all the way in the corner. Moreover, while the words “sponsored links” may be clear to an attorney or an advertising professional, I’ve found that the wording is deeply ambiguous to ordinary users. Sponsored by whom? The search engine? The company the user just asked for? A different label, like “advertisements” or “paid advertisements,” would be more effective in curing confusion. But that’s not on the table.

Meanwhile, litigation does not lend itself to resolving these questions. Consider typical litigation about these ads: Blow up an exemplar onto a big posterboard, analyze it from every angle, and discuss it for days on end. The very process of litigating the case makes it amply clear what’s going on. So it’s hard for a court to get into the mindset of an ordinary user who’s confused, who didn’t know what “sponsored links” meant, and who didn’t really see that label in any case. In this context, it comes as no great surprise that US courts reach mixed results on the question of whether a search engine may show ads for one company when a user requests a direct competitor. European courts, for whatever it’s worth, tend to say search engines must not do so.

Search engines also often claim users benefit from ads for competitors. I guess it’s possible that some users might search for Hertz, not knowing that Avis even exists. But how many users does this really describe? If a consumer actually wants offers from multiple providers, those are easy to get; just search for “car rental” or “rental car deals” to get plenty of choices. In contrast, as described above, when a user searches for a specific provider, competitors’ ads are more likely to be confusing, and less likely to be useful.

Despite lofty claims about consumer benefits, I’ve always thought search engines let advertisers bid on each others’ trademarks for one simple reason: Money. If the only advertiser allowed to bid on ads for “Hertz” is Hertz, a search engine won’t be able to sell many ads. (They’ll sell at most one, to Hertz. But even that one will garner a low price, reflecting that Hertz did not have to outbid anyone else. Furthermore, why should Hertz buy an ad for its own trademark, when it already gets top position through organic listings?) In contrast, if a search engine can get ten different car rental advertisers competing for slots, revenues will increase dramatically. (See my revenue analyses through simulations and counterfactuals.) Now, I don’t mean to say increasing revenue isn’t a laudable goal for search engines. But the financial implications frame my assessment of search engines’ arguments. They say “consumers” and “competition”; I hear “revenues” and “profits.”

The HB450 Approach

Against that backdrop, Utah’s HB450 seeks to provide an alternative. In those narrow circumstances when Utah has proper cause to regulate — among the key conditions, an advertiser using a search service that knows users are in Utah — Utah would require that advertisers not trigger ads based on competitors’ trademarks. The results? Less confusion for consumers who just want to get what they asked for. Plus, companies can reap where they’ve sown. If a company invests in offline advertising (like ads on TV or in newspapers) to get users to search for its brand, those searches will show the company’s ads, not offers from competitors. It’s a perfectly natural, sensible approach.

Indeed, HB450 is a narrow approach. HB450 imposes no possible liability on search engines, no matter what. Rather, HB450 applies only to advertisers. Furthermore, an advertiser’s duty under HB450 is only to take down the offending ads, and even that only after notice. In addition, HB450 grants a successful plaintiff no monetary damages; HB450 allows only an injunction requiring that a defendant take down the offending ads, plus attorneys’ fees to cover the cost of the action, but no further payments. In short, HB450 uses a minimalist approach, grounded in private-sector self-regulation and companies notifying each other of ads they believe cross the line. Far from the intrusive morass Eric Goldman seems to envision, this is sensible and appropriate, protecting consumers from confusing or deceptive ads, and protecting advertisers from competitors trading on their good names.

Nor is HB450 any kind of comprehensive Internet regulation, as an AT&T spokesman claimed in statements to ClickZ. Trademark law and consumer protection are both traditional subjects of state regulation, and there’s no reason why states’ advertising regulations shouldn’t apply online too — particularly as geolocation systems become increasingly widespread and as it therefore becomes feasible, indeed easy and the norm, to present ads differently in one state versus in others.

In due course, I’d like to see federal regulation expand HB450 to national scope. After all, HB450’s protections ought not be limited to consumers and advertisers in Utah, and it would be perfectly natural to offer HB450 nationwide. But it’s perfectly normal for new regulatory approaches to begin in individual states — letting experience in a few states guide the decision to expand more broadly. That’s an appropriate approach here, and my hope is that that’s what will happen.

 

* – My Hertz/Avis example is purely hypothetical. While many advertisers do run ads targeting competitors’ trademarks, I do not mean to suggest that Avis does so.

Auditing Spyware Advertising Fraud: Wasted Spending at VistaPrint

“VistaPrint is disciplined in operation … [VistaPrint’s] marketing [uses] highly analytically driven fact-based decision-making … [W]e manage those [marketing partners] tightly.”

– VistaPrint CEO Robert Keane in a January 2008 earnings call

For more than four years, I’ve been monitoring online advertising — alerting advertisers, ad networks, and the general public when ad spending finds its way to spyware vendors and when advertisers are getting cheated. (Examples: 1, 2, 3, 4, 5) Every day, my Automatic Spyware Tester browses the web on multiple spyware-infected PCs, watching for spyware-delivered advertising and recording its observations in videos and packet logs.

Although VistaPrint’s Robert Keane claims to effectively oversee VistaPrint’s marketing practices, I emphatically disagree. To the contrary, I’ve seen ample evidence of VistaPrint promoted by spyware and adware programs that sneak onto users’ computers without consent (including through security exploits) and through ruse and deception. In many instances, including as detailed in the examples that follow, the corresponding affiliates trick marketing analytics — claiming commission on sales that would have happened anyway, and thereby overstating the true effectiveness of their marketing efforts.

When VistaPrint is cheated by rogue marketing partners, the costs fall in the first instance to VistaPrint shareholders. Every dollar wasted on worthless advertising leaves that much less for corporate profits, and VistaPrint’s advertising budget is already strikingly large: In 2008, VistaPrint marketing consumed 31.9% of revenue (more than $125 million) while profits were just 9.9% ($39.7 million). Meanwhile, fraud against VistaPrint also harms the general public: Consumers suffer unwanted installations of spyware programs funded, in part, by theft from VistaPrint.

The following table summarizes my recent observations of fraud against VistaPrint:

Ad network | Example incident | Rogue affiliates (Aug–Sep 2008) | Dates observed (Aug–Sep 2008) | Observations (Aug–Sep 2008) | Observations (Jan–Jul 2008)
Lynxtrack | Vomba, Hydra Network Affiliate 19934 | 6 | 13 | 18 | 32
Clickbooth | Vomba, Clickbooth Affiliate 14941; WhenU, MediaTraffic, Iadsdirect, Clickbooth Affiliate 7781 | 5 | 13 | 14 | 14
CPA Builder (including traffic from Revenue Gateway, from OptInRealBig / CPAEmpire, and from XY7) | Zango, Revenue Gateway Affiliate 12489, CPA Empire, CPA Builder | 2 | 8 | 9 | 21
CX Digital Media (Incentaclick) | Vomba, Weclub, CX Digital Media Affiliate 13736 | 2 | 2 | 2 | 18
Performics (Google) | Deluxe Communications, Smartyseek, Performics | 1 | 5 | 5 | 5
Direct relationships & other networks | not yet tabulated in full – some examples on file | – | – | – | –

During August-September 2008, my AutoTester repeatedly observed VistaPrint facing rogue traffic coming from five different ad networks. In the sections that follow, this piece presents an example of fraud by an affiliate from each of the specified networks. But I’ve seen plenty more. My AutoTester has been running for more than a year — preserving tens of thousands of records of online advertising fraud, including 133 other spyware incidents arising out of traffic to VistaPrint. These many incidents confirm the breadth of improper practices by VistaPrint’s marketing partners.

Example 1: Vomba, Hydra Network Affiliate 19934 Claiming Commission on VistaPrint’s Organic/Type-In Traffic

Vomba, Hydra Network Affiliate 19934 Targeting VistaPrint

In testing on September 12, my AutoTester browsed VistaPrint’s site on a computer with Vomba (from Integrated Search Technologies, makers of Slotchbar, XXXtoolbar, WhenU, AdVantage, and more). Vomba popped open a window that sent traffic to Hydra Network (LynxTrack) (affiliate 19934), and Hydra Network in turn forwarded the traffic back to VistaPrint. The result was the screen shown at right — the original VistaPrint window at left/back, with a new popup at front/right.

Crucially, both web browser windows share a single set of cookies. Whether the user buys from the original VistaPrint window or from the popup, cookies tell VistaPrint that this Hydra Network affiliate caused the sale. So VistaPrint will pay this affiliate a commission — even though, in fact, the affiliate did nothing whatsoever to facilitate the sale. I call this tactic “self-targeting” — reflecting that Vomba covers VistaPrint with its own ad. All of the examples presented on this page entail spyware/adware performing this kind of self-targeting attack.
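
The mechanics are easy to model. The sketch below is a generic model of last-click, cookie-based affiliate attribution — not VistaPrint’s or Hydra Network’s actual code, and the 10% commission rate is hypothetical — but it shows why a single shared cookie jar lets a self-targeting popup claim credit for a sale the user was already going to make.

```python
# Generic sketch of last-click affiliate attribution (not any specific merchant's
# or network's actual implementation). The popup and the original window share one
# cookie jar, so the popup's affiliate click sets the attribution cookie and the
# affiliate is credited for a purchase it did nothing to cause.
from dataclasses import dataclass, field

@dataclass
class Browser:
    cookies: dict = field(default_factory=dict)  # shared by every window and popup

    def visit_merchant_directly(self) -> None:
        pass  # organic/type-in visit: no affiliate cookie is set

    def affiliate_click(self, affiliate_id: str) -> None:
        # the popup's redirect chain lands on the merchant with an affiliate ID,
        # which the merchant stores for commission attribution (last click wins)
        self.cookies["affid"] = affiliate_id

    def checkout(self, order_total: float, commission_rate: float = 0.10):
        # at checkout, whoever holds the attribution cookie gets the commission
        credited = self.cookies.get("affid")
        commission = order_total * commission_rate if credited else 0.0
        return credited, commission

browser = Browser()
browser.visit_merchant_directly()   # user types in the merchant's address
browser.affiliate_click("19934")    # spyware popup fires an affiliate link in the same session
print(browser.checkout(100.00))     # ('19934', 10.0) -- commission paid on a sale the affiliate did not cause
```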

My AutoTester preserved a video of this incident and a packet log of the underlying network traffic.

My AutoTester observed this same affiliate using the same method on three different dates in August-September 2008. My AutoTester also observed five other Hydra Network affiliates similarly defrauding VistaPrint. All told, in August-September, my AutoTester observed 18 such incidents on 13 distinct dates.

My AutoTester’s records indicate that Hydra Network receives substantial spyware-originating traffic. Looking back to June 2007, across all my AutoTester’s browsing, my AutoTester has seen a remarkable 1,287 instances of spyware sending traffic to/through Hydra Network.

Example 2: Vomba, Clickbooth Affiliate 14941 Claiming Commission on VistaPrint’s Organic/Type-In Traffic

In testing on September 12, my AutoTester browsed VistaPrint’s site, again on a computer with Vomba. Vomba popped open a window that sent traffic to Clickbooth (affiliate 14941), and Clickbooth in turn forwarded the traffic back to VistaPrint.

Because both web browser windows share a single set of cookies, this Clickbooth affiliate gets paid a commission whether the user buys from the original VistaPrint window or from the popup. This commission gets paid even though, in fact, the affiliate did nothing whatsoever to facilitate the sale.

My AutoTester preserved a video of this incident and a packet log of the underlying network traffic.

My AutoTester observed this same affiliate using the same tactics on eight different dates in August-September 2008. My AutoTester also observed three other Clickbooth affiliates similarly defrauding VistaPrint. All told, my AutoTester observed 13 such incidents on 12 distinct dates.

My AutoTester’s records indicate that Clickbooth receives substantial spyware-originating traffic. Looking back to June 2007, across all my AutoTester’s browsing, my AutoTester has seen 917 instances of spyware sending traffic to/through Clickbooth.

Example 3: WhenU, MediaTraffic, Iadsdirect, Clickbooth Affiliate 7781 Claiming Commission on VistaPrint’s Organic/Type-In Traffic

In manual testing on September 28, I browsed VistaPrint’s site on a computer with WhenU. WhenU opened a popunder that flashed briefly on screen (video at 0:15) but then forced itself to an off-screen location where I could not see it even when I minimized other windows. (See video at 0:24 to 0:30, when I attempted to find the popunder.) By manually right-clicking and choosing “maximize,” I managed to make the popunder visible — confirming that it loaded VistaPrint and noting the affiliate ID number.

Packet log analysis reveals that traffic flowed from WhenU to MediaTraffic (a pay-per-view advertising marketplace also operated by Integrated Search Technologies) to Iadsdirect to Clickbooth (affiliate 7781) to VistaPrint.

As in prior examples, both windows share a single set of cookies. Thus, the WhenU popunder causes the corresponding affiliate to receive a commission if the user makes a purchase — even though the affiliate did nothing to encourage or facilitate a purchase.

I preserved a video of this incident and a packet log of the underlying network traffic.

This advertising fraud by WhenU is particularly notable because WhenU previously claimed to have reformed all unsavory practices. (See e.g. “WhenU CEO Bill Day Cleans House.”) Moreover, WhenU previously touted a TRUSTe Trusted Download certification, and TRUSTe specifically prohibits Trusted Download programs from defrauding advertisers. (See Certification Agreement, Schedule A (“Program Requirements”), provision 14.k.) That said, WhenU has silently left the Trusted Download whitelist. Furthermore, in separate testing of WhenU software, I have recently seen repeated self-targeting fraud improperly claiming commissions from a variety of advertisers.

Example 4: Zango, Revenue Gateway Affiliate 12489, CPA Empire, CPA Builder Claiming Commission on VistaPrint’s Organic/Type-In Traffic

The money trail and traffic flow: VistaPrint → CPA Builder → CPA Empire → Revenue Gateway → Zango (money flows down the chain; viewers flow back up it)

In testing on September 21, my AutoTester browsed VistaPrint’s site on a computer with Zango. Zango popped open a window that sent traffic to Revenue Gateway (affiliate 12489), which redirected to CPA Empire (formerly OptInRealBig), which redirected to CPA Builder, which in turn forwarded the traffic back to VistaPrint.

The chain of intermediaries adds complexity to the relationships. But traffic flows in a continuous forward path: From Zango to Revenue Gateway to CPA Empire to CPA Builder and finally back to VistaPrint. Conversely, revenue flows in the opposite direction: From VistaPrint to CPA Builder to CPA Empire to Revenue Gateway to Revenue Gateway affiliate 13425 to Zango. The diagram above summarizes the flows of traffic and money.

My AutoTester preserved a video of this incident and a packet log of the underlying network traffic.

During August-September 2008, my AutoTester also observed other incidents wherein spyware waited for a user to browse the VistaPrint site, then sent the user back to VistaPrint via CPA Builder. Beyond this Zango / Revenue Gateway / CPA Empire example, I also observed incidents wherein CPA Empire’s relationship with XY7 was the source of the tainted traffic. All told, my AutoTester has preserved more than 600 incidents of spyware sending traffic to/through CPA Empire, as well as at least 24 incidents of spyware sending traffic to/through Revenue Gateway (though I have reason to believe that some Revenue Gateway incidents were not preserved).

Example 5: Vomba, Weclub, CX Digital Media (Incentaclick) Affiliate 13736 Claiming Commission on VistaPrint’s Organic/Type-In Traffic

Vomba, Weclub, CX Digital Media Affiliate 13736 Targeting VistaPrint

In testing on August 17, my AutoTester browsed VistaPrint’s site on a computer with Vomba. Vomba popped open a window that sent traffic to Weclub, which immediately redirected to CX Digital Media (Incentaclick), which in turn forwarded the traffic back to VistaPrint.

See the screenshot at right. My AutoTester preserved a video of this incident and a packet log of the underlying network traffic.

During August-September 2008, my AutoTester also observed another CX Digital Media affiliate using spyware to claim commission on VistaPrint’s organic traffic. All told, my AutoTester has preserved more than 200 different incidents of spyware sending traffic to/through CX Digital Media.

Example 6: Deluxe Communications, Smartyseek, Performics Claiming Commission on VistaPrint’s Organic/Type-In Traffic

In testing on September 14, my AutoTester browsed VistaPrint’s site on a computer with Deluxe Communications (which I have repeatedly observed installed through security exploits and otherwise without user consent). Deluxe Communications popped open a window that sent traffic to Smartyseek, which immediately redirected to Performics, then back to VistaPrint.

In typical Deluxe Communications fashion, the popup window entirely covered the window the user had been browsing. But because both windows showed VistaPrint, some users might not notice.

My AutoTester preserved a video of this incident and a packet log of the underlying network traffic.

My AutoTester observed this same affiliate using the same tactics on five different dates in August-September 2008, and my AutoTester also observed Performics traffic during VistaPrint browsing on five other (prior) occasions.

Responsibility and Causation

It’s easy to present VistaPrint as a perpetrator: VistaPrint fails to adequately oversee its marketing partners. As a result, VistaPrint’s advertising spending helps fund spyware and adware programs that sneak onto users’ PCs, with serious harms to performance, reliability, and privacy.

But I also see an important sense in which VistaPrint is a victim: VistaPrint’s marketing partners are defrauding VistaPrint by claiming commissions on sales they actually did nothing to cause. Such commissions are entirely wasted, yielding no bona fide marketing benefit to VistaPrint.

By all indications, VistaPrint faces significant difficulties in supervising its marketing partners. Yet other major retailers handle such challenges with greater success. For example, it is comparatively rare to see spyware or adware promoting, defrauding, or attempting to defraud Amazon — even though Amazon spends nearly three times as much on marketing as VistaPrint ($344 million to $125 million).

What could VistaPrint do differently? For one, I question VistaPrint’s choice of marketing partners: As the preceding statistics indicate, I have repeatedly and widely seen spyware and adware sending traffic to many of the partners VistaPrint works with. VistaPrint might face less fraud if it favored marketing partners with a track record of successful supervision of their affiliates.

More generally, an affiliate currently faces little real downside to attempting to defraud VistaPrint. If an affiliate gets caught cheating, VistaPrint will terminate that affiliate, but I see little indication that VistaPrint exacts any meaningful penalty to make the affiliate (or the network providing that affiliate) regret its transgression. In Deterring Online Advertising Fraud Through Optimal Payment in Arrears, I suggest a different approach — paying affiliates more slowly so that they face greater losses if they are found to be cheating. Alternatively, VistaPrint might sue affiliates it learns are cheaters, as in eBay v. Digital Point Solutions and Lands’ End v. Remy.

Yet Keane’s remarks (“highly analytically driven fact-based decision-making”) reveal that VistaPrint is at least attempting to supervise its marketing partners to optimize its spending. How, then, could VistaPrint end up facing so much fraud? I suspect VistaPrint’s analytics actually lead the company astray. Consider the tactics presented above, from the perspective of the information easily available to VistaPrint’s marketing staff. Because these affiliates target users who are already interested in VistaPrint, the affiliates’ conversion rates are likely to be well above average. Moreover, because these affiliates incur limited costs, they can accept payments far below what Google might require. Thus, VistaPrint’s staff are likely to assess these affiliates favorably — without realizing that the traffic at issue is traffic VistaPrint would otherwise have gotten for free. Put differently: Although VistaPrint’s measurements may be very precise, they’re inaccurate because VistaPrint misunderstands the sources of affiliates’ traffic.

In attempting to prevent such fraud, VistaPrint should also examine its ad networks’ incentives. Ad networks often mark up affiliates’ fees: For every dollar VistaPrint is slated to pay to a given affiliate, that affiliate’s network takes another (say) $0.20. As a result, ad networks have a clear incentive to tolerate rogue affiliates: Networks make money from each sale credited to an affiliate, so ejecting rogue affiliates would directly reduce the network’s earnings.
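
A quick back-of-the-envelope illustration makes the incentive concrete. The monthly figures below are hypothetical; the 20% markup is the "(say) $0.20" per commission dollar described above.

```python
# Hypothetical monthly figures illustrating an ad network's incentive to tolerate a rogue affiliate.
rogue_affiliate_commissions = 5_000.00  # commissions credited to a rogue self-targeting affiliate (hypothetical)
network_markup_rate = 0.20              # the "(say) $0.20" markup per commission dollar

network_gain_if_tolerated = rogue_affiliate_commissions * network_markup_rate
merchant_cost_if_tolerated = rogue_affiliate_commissions * (1 + network_markup_rate)

print(f"network earns ${network_gain_if_tolerated:,.0f}/month by keeping the rogue affiliate")
print("network earns $0/month by ejecting it")
print(f"merchant pays ${merchant_cost_if_tolerated:,.0f}/month for traffic it would have received free")
```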

The Big Picture

Spyware-based advertising fraud extends far beyond VistaPrint. Most merchants operating affiliate, CPA, or other conversion-contingent programs face similar fraud. But VistaPrint is a large and, purportedly, sophisticated advertiser. So VistaPrint could appropriately lead by example.

I’m overdue to present further examples of spyware and adware continuing to defraud major merchants. Historically my articles have tended to emphasize the largest US affiliate networks — Commission Junction, LinkShare, Performics. But there’s plenty of fraud through smaller networks too, as well as through networks based outside the US. I’ll present additional examples later this fall.

In January, an Anti-Spyware Coalition workshop asked “Is adware dead?” Some panelists responded substantially in the affirmative. But my AutoTester indicates otherwise. I’m pleased to see that big advertisers no longer advertise directly with major adware vendors. Yet a chain of indirection — adware sending traffic to one ad network, which forwards to another, then finally to an advertiser — continues to promote top brands. Furthermore, spyware-delivered banner farms and ad-loaders are becoming increasingly widespread. This month I saw adware still promoting American Express, Apple, and AT&T — to name just a few of the A’s. There’s plenty of work left to be done.

Debunking Zango’s "Content Economy" updated May 29, 2008

Zango often touts its so-called “content economy” — purportedly providing users access to media in exchange for accepting Zango’s popup ads. After four years of debunking Zango’s claims, I’ve come to suspect the worst — and my investigations of Zango’s media offerings confirm that Zango’s media library is nothing to celebrate. This article reports the results of my recent examinations. I show:

  • Widespread copyrighted video content presented without any indication of license from the corresponding rights-holders. Details.
  • Widespread sexually-explicit material, including prominent explicit material nowhere labeled as such. Details.
  • An audio library consisting solely of prank phone calls to celebrities (without the “music” Zango promises). Details.
  • Widespread material users can get elsewhere for free, without any popups or other detriments. Details.
  • Widespread material that content creators never asked to have included in any Zango library. Details.

Widespread copyrighted video content presented without any indication of license from the corresponding rights-holders

Many of the videos in Zango’s video library are the work of major movie studios, TV networks, and other third parties that own and assert copyright in their respective works. These videos consistently appear without any statement of authorization (e.g. “used with permission”) or even the ordinary copyright notice. I therefore conclude that Zango’s site features these videos without authorization from the corresponding rights-holders.

Zango Offers Daily Show with Guest Chris Rock

Zango Offers 'Borat'

For many videos in Zango’s library, it is trivially easy to determine the video’s source. For example, text in the corner of Zango’s “Ashley Judd Nude Photoshoot” indicates the video comes from “Norma Jean & Marilyn” (1996, released on DVD by HBO Home Video). The title of Zango’s “Wild Things” suggests the video comes from the 2004 Sony Pictures movie by the same name; watching the video confirms the match. Zango’s “Girls Next Door Nude Compilation” begins with the distinctive Playboy logo. Zango’s “Chris Rock on the Daily Show” reproduces a video clip from Comedy Central’s Daily Show. It’s easy to find scores of other examples plainly labeled as well-known copyrighted works.

Other videos in Zango’s library are harder to identify — at least for those without extensive entertainment industry experience. For example, I cannot easily determine the specific movie that included the scenes shown in Zango’s “Paris Hilton Striptease” or “Rachel Hunter in the Bathtub.” But the clips leave little doubt that they were filmed professionally and that the respective studios hold copyright in the resulting works. Similarly, I cannot easily determine the specific source of Zango’s “Branding Beat Down.” However, every frame of the video bears the distinctive Fox logo — indicating that the video originated with the Fox Broadcasting Company.

As to at least eight of the files in Zango’s library, I have specifically confirmed that Zango’s reproduction occurs without authorization from the underlying rights-holders. (Details below.) As to selected other files, I have sent inquiries to the corresponding rights-holders. I will update this page if I confirm whether Zango has properly licensed the content at issue.

Infringing videos are remarkably prominent in Zango’s video library. For example, as of May 27, Zango’s home page linked to “Borats First Trip To An American Gym” (sic). This clip was listed as the second most popular video in Zango’s entire content library, and it was placed in the top-center of Zango’s main www.zango.com web page, “above the fold” (within the portion of the page visible without using scroll bars). Yet the title of the video plainly indicates that the video contains the copyrighted work of others. Moreover, the video features the “DIVX Video” logo, indicating that DivX software was used to extract (“rip”) the video from a DVD. No authorized reproduction would be provided with a DivX overlay, so the presence of the DivX marker confirms that this video was reproduced without permission from the creators of Borat.

Other online video sites have been the target of major copyright litigation. For example, Viacom last year sued Google, alleging that “YouTube appropriates the value of creative content on a massive scale for YouTube’s benefit without payment or license.” In defense, Google points out that YouTube receives videos from independent users — potentially granting Google immunity for these infringements due to the Digital Millennium Copyright Act‘s safe harbor for infringements occurring at the direction of users (17 USC 512(c)(1)).

Unlike YouTube, Zango’s video library offers no prominent “upload” function. Some of Zango’s videos arrive through the Revver video-sharing service (discussed below), probably originating with a variety of independent users. But many of the copyrighted videos Zango offers reside on Zango’s servers, not on Revver servers. (For example, all eight of the sexually-explicit videos linked in the first paragraph of the next section are hosted on Zango servers.) Because Zango offers no “upload” function by which ordinary users could have put videos onto Zango’s site, it therefore appears that these videos were provided by Zango or its agents, not by independent users. If so, Zango will not find protection in the DMCA’s safe harbor for infringements caused by users.

Moreover, even if Zango’s videos were provided by independent users, the circumstances of the reproduction seem to render Zango ineligible for the DMCA safe harbor. For one, the safe harbor requires that Zango lack actual knowledge of the infringements. But the infringing videos were obvious and self-evident, not just from their titles and contents, but also from their prevalence in featured results Zango chose to highlight. In addition, the safe harbor requires that Zango not receive a financial benefit directly attributable to the infringements. But Zango used these videos to induce users to download its popup-generating software, a financial benefit that is directly attributable to the infringing videos. (Consider the case of a user who installs Zango in response to a solicitation offering a specific copyrighted video clip. Example.) Furthermore, Zango has the right and ability to control the infringement (e.g. by removing the infringing videos). Because Zango’s financial benefit can be traced directly to a specific infringement, and because Zango has the right and ability to prevent such infringement, Zango seems to fail the test in 17 USC 512(c)(1)(B).

Zango may claim that its videos are fair use. The Copyright Act sets out a four-factor test for determining whether reproduction of a copyrighted work is permissible despite lack of authorization from the rights-holder. The fair use test calls for considering 1) the purpose and character of the use (e.g. whether commercial or nonprofit), 2) the nature of the copyrighted work, 3) the amount and substantiality of the portion used, and 4) the effect of the use upon the potential market for the work. Factor one is easy: Zango’s use is clearly commercial, which tends to cut against a finding of fair use. Zango might claim that its presentation of excerpts (rather than entire movies) supports a finding of fair use under the third factor — but Zango chooses exactly what it views as the highlights (e.g. the explicit portions of full-length movies), yielding clips with a greater-than-usual effect on the potential market for the underlying works under the fourth factor. In short, a fair use defense is at best uncertain.

Wide-scale copyright infringement could expose Zango to substantial liability. The Copyright Act provides for statutory damages of “not less than $750” per violation. My examination indicates Zango is reproducing (at least) hundreds of copyrighted videos without any statement of authorization. Furthermore, such videos have surely been downloaded repeatedly — giving rise to potential statutory damages that could easily reach seven digits or more.
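To make the order of magnitude concrete, here is a back-of-the-envelope sketch in Python. The count of 300 infringed works is purely hypothetical (my examination suggests "hundreds," not a precise figure), and the per-work dollar amounts are the statutory range in 17 USC 504(c); this is an illustration, not a damages calculation.

```python
# Back-of-the-envelope statutory-damages illustration (hypothetical count).
# 17 USC 504(c) allows $750 to $30,000 per infringed work, and up to
# $150,000 per work for willful infringement.

WORKS = 300            # hypothetical number of infringed videos (assumption)
STATUTORY_MIN = 750    # statutory minimum per work
ORDINARY_MAX = 30_000  # ordinary statutory maximum per work
WILLFUL_MAX = 150_000  # maximum per work if infringement is willful

print(f"Statutory floor:  ${WORKS * STATUTORY_MIN:,}")   # $225,000
print(f"Ordinary ceiling: ${WORKS * ORDINARY_MAX:,}")    # $9,000,000
print(f"Willful ceiling:  ${WORKS * WILLFUL_MAX:,}")     # $45,000,000
```

Even at the statutory floor, a few hundred works yields six-figure exposure; anything above the floor quickly reaches seven or eight digits.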

Widespread sexually-explicit material, including prominent explicit material nowhere labeled as such

Celebrity Videos Featured by Zango

Prominent Video – Explicit but Unlabeled

Browse Zango’s video library, and it’s easy to find sexually-explicit video. As shown in the first inset image at right, the bottom-right corner of each Zango “Browse” page gives a list of celebrities — each of them female, each featured in various states of undress. Among other explicit videos of these celebrities, Zango offers “Britney Spears See Thru”, “Britney Spears Black Dress Upskirt”, “Paris Hilton Striptease”, “Rachel Hunter in the Bathtub”, “Jessica Alba’s Chest and You”, “Jessica Simpson Nipple Slip”, “Anna Kournikova Panties Oops”, and “Angelina Jolie Sex Scene.”

The titles and descriptions of many of Zango’s videos suggest that their subjects were unwilling participants. See e.g. “nipple slip” and “upskirt” above, as well as additional videos like Zango’s “Arab wife’s sexy dance secretly taped” and Zango’s “Girlfriend Finds Hidden Camera.”

Through its placement and labeling of sexually-explicit videos, Zango creates a substantial risk that users will receive explicit materials they did not seek. For example, on May 24, I clicked “Browse” to flip through Zango’s content library. Using Zango’s default sort, the third video was entitled “the pool” with comment “havin fun in the pool” (sic). (Screenshot of the link from within Zango’s video library.) This title and comment give no indication that the resulting material is explicit. But clicking the “Watch” button immediately yields a large video showing two male adults swimming nude, then exiting the pool (entirely disrobed). As best I can tell, Zango did nothing to alert users to this explicit material, nor does Zango prevent (or even discourage) children from viewing such material.

Zango’s May 24 “the pool” video was not a mere anomaly. The same video remained linked in the same way in my tests on May 25 and 26, and on portions of May 27.

In litigation documents, Zango last week claimed that it never distributes explicit material to those who do not want it. In particular, Zango argues: “Zango never sends unwanted links to pornography web sites” and “Zango only directs adult-oriented advertisements to a user after that user, by his own behavior, has demonstrated interest in such content.” I disagree. The preceding paragraphs offer a counterexample — Zango prominently providing a link to sexually-explicit materials, and providing that link to users who never demonstrated interest in any such content. Zango may claim that these links tout videos — not a “web site” as in the first quoted sentence. Alternatively, Zango may claim that the links are not “advertisements” — hence beyond a strict reading of the second quoted sentence. But the underlying contradiction remains: Zango says it doesn’t provide pornography except when users seek it; yet in fact Zango does sometimes deliver explicit materials unrequested.

That Zango funds and distributes sexually-explicit materials is well-known. See e.g. the Sunbelt Blog’s February 2008 conclusion that “80% of [Zango’s] business comes from Seekmo, the porn side of its business.” See also Sunbelt’s off-hand November 2006 remark that “hardcore porno videos [are] funded through Zango Seekmo installs.”

But the scope of explicit materials within Zango’s video library is quite striking. Consider the first page of Zango’s library listings for Angelina Jolie. Beyond the “sex scene” video linked above, the listings also include “Angelina Jolie Taking a Bath”, “Angelina Jolie Under the Sheets”, “Angelina Jolie in Bra & Panties”, “A fairly long nude scene staring Angelina Jolie” (sic), “Angeline Jolie Getting It On”, “Angelina Jolie Nip Slip”, “Angelina Jolie Hardcore”, “Angelina Jolie Dominatrix”, and “Angelina Jolie Hot On The Runway.” That’s ten explicit results out of twenty links — suggesting that explicit materials are remarkably widespread on Zango’s site.

The initial version of this article also flagged Zango’s “Nice But” (sic), a video that on May 27 occupied the fourth-most prominent position in Zango’s “Browse” listings. The thumbnail image of this video appeared to show a man’s naked buttocks filling the entire frame. In a follow-up, Zango points out that the video in fact shows an extreme close-up of two hands, so the image and video are not actually explicit. Yet a viewer merely flipping through Zango’s listings would nonetheless see an image that appears, by all indications, to be explicit. The title “but” (sic) and the keyword “naked,” both adjacent to the thumbnail, reinforce the user’s perception of having seen an unrequested explicit image. Although the image is not actually explicit, its content, placement, and labeling are likely to leave users with the same impression as an image that truly is: a viewer who merely sees the image and does not watch the video will think he has seen unwanted explicit material. In my view, Zango errs in mocking this harm. To the users whom Zango tricks, the harm is perfectly real.

Zango’s audio library consists solely of prank phone calls to celebrities

Zango Offers Prank Phone Call Recordings

Zango’s content library offers three types of media: Videos, screensavers, and audio. Despite Zango’s much-touted “content economy,” Zango offers just eight audio clips. And although Zango’s “About Zango” description promises to provide free access to “music,” in fact all eight of these audio files are recordings from talk radio — just voices, with no music at all.

All eight of Zango’s audio recordings share a common theme: Prank phone calls to celebrities. In each, a caller pretends to be someone famous (e.g. the Prime Minister of Canada), and calls a celebrity (e.g. Bill Gates) under the guise of a bona fide discussion. The caller proceeds to berate the celebrity (e.g. by criticizing the features and reliability of Windows).

A comment accompanying several of these recordings reveals their source: The Masked Avengers, which Wikipedia describes as “a Canadian radio duo … of disk jockeys and comedians Sebastien Trudel and Marc-Antoine Audette, known for making prank calls to famous persons by pretending to be government officials or officers in charitable organizations.” I wrote to Mr. Trudel, who confirmed to me that he has not granted Zango any license to use or reproduce these clips.

After placing these recordings in its content library, Zango further syndicates the materials onto Zango’s partner sites. For example, celebsprankd.com (screenshot) features all eight recordings, but requires users to install Zango before listening. Whois reports that Celebsprankd comes from the Vancouver, B.C. advertising firm Neverblue Media — a conclusion confirmed by the presence of the Neverblue.com web server at the same IP address. Neverblue describes itself as a “leading … online marketing company” offering “premier” advertising and “solid business leads” — claims arguably inconsistent with distributing and profiting from prank phone calls, not to mention distributing Zango. (But these recordings aren’t Neverblue’s only tie to Zango. This month alone, my Automatic Spyware Tester found eleven incidents of Neverblue affiliates buying popup traffic from Zango. I’ve also found dozens more incidents as to Neverblue affiliates buying traffic from other spyware.)
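For readers who want to spot-check that kind of attribution, the shared-server observation boils down to comparing where the two hostnames resolve. A minimal sketch, assuming both domains still resolve (they may not today), and noting that this checks DNS only rather than whois registration data:

```python
# Minimal sketch: do two hostnames resolve to the same IP address?
# This is only a rough shared-server check, not a whois lookup, and the
# domains discussed above may no longer resolve as they did in 2008.
import socket

def first_ipv4(hostname: str) -> str:
    """Return the first IPv4 address the hostname resolves to."""
    return socket.gethostbyname(hostname)

if __name__ == "__main__":
    a = first_ipv4("celebsprankd.com")
    b = first_ipv4("neverblue.com")
    print(a, b, "same server candidate:", a == b)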

What of Zango’s distribution of these prank call recordings? With so few clips yet such prominent placement (including five of these eight audio recordings featured on Zango’s home page), senior Zango staff surely know what the files contain. Does Zango support prank phone calls? Wasting celebrities’ time under false pretenses? Recording phone calls without permission, even in states that specifically require such permission? It’s hard to reconcile these practices with Zango’s supposed reforms.

Widespread material users can get elsewhere for free, without any popups or other detriments

Much of Zango’s content is available elsewhere without charge and without installing any software that tracks online behavior or shows popup ads. For example, after clicking Zango’s “Browse” tab and retaining the defaults, I found that every single video on the first page of results is syndicated from Revver. Users could just as easily get these videos directly from Revver as receive them from Zango. But if users watched these videos at Revver, Zango’s software would not track their web browsing and searching, and users would not receive Zango’s popup ads.

Zango Falsely Tells Its Users: “Uninstallation … eliminates content access”

Furthermore, Zango makes untrue claims about the necessity of its software. For example, Zango claims that “uninstallation … eliminates content access.” It does not. For files hosted at Revver, installation of Zango is not necessary to watch the videos in the first place, and uninstallation does not interfere with watching the videos later. Moreover, even many Zango-hosted files can be accessed without installing Zango, or after uninstalling Zango. For example, Zango’s “Chris Rock on the Daily Show” is actually just a standard Windows Media Video (WMV) file distributed from the following URL: preview.licenseacquisition.org/123/1054944882.36393/yikers_chris_rock_on_the_daily_show.wmv . Zango’s “Borats First Trip To An American Gym” (sic) is preview.licenseacquisition.org/123/1054944854.02531/yikers_borats_first_trip_to_an_american_gym.wmv . Similarly, Zango’s “Bill Gates Gets Pranked” is a Windows Media Audio (WMA) file hosted at preview.licenseacquisition.org/13/12295/12295.wma . Any user who knows these URLs can easily retrieve the corresponding files — without ever installing Zango, or after uninstalling Zango. Zango ought not claim otherwise.
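To illustrate how little Zango’s client actually adds here, a minimal sketch of retrieving one of the files cited above by its direct URL, with no Zango software installed, might look like the following. (Assumes the URL is still live, which it may not be; the output filename is arbitrary.)

```python
# Minimal sketch: fetch a directly-hosted media file without Zango's software.
# The URL is one cited in the paragraph above and may no longer be live.
import urllib.request

URL = "http://preview.licenseacquisition.org/13/12295/12295.wma"

with urllib.request.urlopen(URL) as response, open("12295.wma", "wb") as out:
    out.write(response.read())  # save the clip to a local file

print("Saved 12295.wma")
```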

Presenting material that content creators never asked to have included in any Zango library

By syndicating videos from Revver, Zango causes its video library to feature materials that content creators never asked to have associated with Zango in any way.

Zango’s syndication of Revver videos has prompted numerous complaints from content creators who post videos to Revver. For example, Chris Pirillo asked why his videos are appearing on Zango. (“I don’t remember giving Zango permission to push crapware on my behalf.”) Revver forum user JPPI pointed out the irony of Zango claiming his videos were “FREE, thanks to Zango” when in fact the videos were free all along (even before Zango syndicated them). Revver forum user David complained that it is “kinda deceptive” (sic) “to make it sound like Zango was the one who made the video free.”

In response, Revver Vice President Asi Behar agreed to ask Zango to remove any Revver videos that Revver authors specifically so designate. But such removals do nothing to cure the deception of Zango requiring that users install its software before watching materials widely available elsewhere for free. Furthermore, such removals do nothing to protect Revver content creators who are unaware of Revver’s relationship with Zango. The word “Zango” appears nowhere on Revver’s official web site (as distinguished from Revver’s forums and some Revver-hosted videos). Thus, a Revver content creator has no easy way to learn about Revver’s relationship with Zango — not to mention learn of the option to request exclusion from Zango.

Zango’s syndication of Revver videos risks tainting the good name of Revver content creators. Consider a user who searches for a Revver video and finds that video hosted at Zango (just as Chris Pirillo did last year). The user may mistakenly conclude that installing Zango is in fact necessary to watch the video. If so, the user is likely to end up with a negative view of the underlying content creator — mistakenly concluding that, e.g., Chris Pirillo has partnered with Zango or endorses Zango’s activities. Revver forum complaints indicate that numerous Revver users share this concern. Yet Revver continues to syndicate videos to Zango without first checking with content creators.

Zango’s problems in context

Last week, Zango was one of four finalists for the Software & Information Industry Association’s CODiE Award for Best Video Content Aggregation Service. In my view, that recognition is misguided: Far from deserving praise, Zango should be criticized and shunned for reproducing others’ copyrighted work without any apparent license to do so, showing sexually-explicit material unrequested, and offering users a lousy value by bundling extra ads with content users could get elsewhere for free.

Meanwhile, Zango continues litigation with Kaspersky. Recall: Kaspersky blocked Zango’s software from installing; Zango sued; Kaspersky successfully defended on the grounds that the Communications Decency Act, 47 USC 230, immunizes Kaspersky’s behavior because Kaspersky is an “interactive computer service provider” blocking material that, in its subjective opinion, is “objectionable.” In Zango’s appeal, Zango claims its software is not “otherwise objectionable” (brief pages 12-15; PDF pages 17-20). If it is not “objectionable” to show explicit material unrequested, to infringe copyrights on a massive scale, and to insert extra ads around material available elsewhere without such ads, then I don’t know what is.

Finally, I’m often asked whether Zango continues the behaviors I previously reported. Installing through sneaky fake-user-interface pop-up ads that mimic the appearance of official Windows dialog boxes (as I reported last summer)? Yes. I made a fresh video showing such installations just last week. Defrauding advertisers through popups that cover merchants’ sites with their own affiliate offers (as I reported last spring, in September 2005, in summer 2004, and otherwise)? Definitely. This month alone, I reported six Zango incidents to just one of my advertiser clients — not to mention scores of other incidents targeting other web sites and advertisers. Zango repeatedly claims its problems are all in the past, but my hands-on testing continues to indicate otherwise.