Honey’s Dieselgate: Detecting and Tricking Testers

MegaLag’s December 2024 video introduced 18 million viewers to serious questions about Honey, the widely-used browser shopping plug-in—in particular, whether Honey abides by the rules set by affiliate networks and merchants, and whether Honey takes commissions that should flow to other affiliates.  I wrote in January that I thought Honey was out of line.  In particular, I pointed out the contracts that limit when and how Honey may present affiliate links, and I applied those contracts to the behavior MegaLag documented.  Honey was plainly breaking the rules.

As it turns out, Honey’s misconduct is considerably worse than MegaLag, I, or others knew.  When Honey suspects that a user may be a tester—a “network quality” employee, a merchant’s affiliate manager, an affiliate, or an enthusiast—Honey designs its software to honor stand-down in full.  But when Honey feels confident that it’s being used by an ordinary user, Honey defies stand-down rules.  Multiple methods support these conclusions: I extracted source code from Honey’s browser plugin and studied it at length, I ran Honey through a packet sniffer to collect its config files, and I cross-checked all of this against actual app behavior.  Details below.  MegaLag tested too, and has a new video with his updated assessment.

(A note on our relationship: MegaLag figured out most of this, but asked me to check every bit from first principles, which I did.  I added my own findings and methods, and cross-checked with VPT records of prior observations as well as historic Honey config files.  More on that below, too.)

Behaving better when it thinks it’s being tested, Honey follows in Volkswagen’s “Dieselgate” footsteps.  As with Volkswagen, the cover-up is arguably worse than the underlying conduct.  Facing the allegations MegaLag presented last year, Honey could try to defend presenting its affiliate links willy-nilly—argue users want this, claim to be saving users money, suggest that network rules don’t apply or don’t mean what they say.  But these new allegations are far harder to defend.  By designing its software to perform differently when under test, Honey reveals that it knows what the rules require and that it would be in trouble if caught.  Hiding from testers reveals that Honey wanted to present affiliate links as widely as possible, despite the rules, so long as it didn’t get caught.  It’s not a good look.  Affiliates, merchants, and networks should be furious.

What the rules require

The basic bargain of affiliate marketing is that a publisher presents a link to a user, who clicks, browses, and buys.  If the user makes a purchase, commission flows to the publisher whose link was last clicked.

Shopping plugins and other client-side software undermine this basic bargain.  If a publisher puts software on a user’s computer, that software can monitor where the user browses, present its affiliate link, and always (appear to) be “last”—even if it played a minimal role in influencing the customer’s purchase decision.

Affiliate networks and merchants established rules to restore and preserve the bargain between what we might call “web affiliates” and software affiliates.  One, a user has to actually click a software affiliate’s link; decades ago, auto-clicks were common, but they’ve long since been banned (yet remain routine from “adware”-style browser plugins—example).  Two, software must “stand down”—must not even show its link to users—when some prior web affiliate P has already referred a user to a given merchant.  This reflects a balancing of interests: P wants a reasonable opportunity for the user to make a purchase, so P can get paid.  If a shopping plugin could always present its offer, the shopping plugin would claim the commission that P had fairly earned.  Meanwhile, P wouldn’t get sufficient payment for its effort—and might switch to promoting some other merchant whose rules P sees as more favorable.  Merchants and networks need to maintain a balance in order to attract and retain web affiliates, which are understood to send traffic that’s substantially incremental (customers who wouldn’t have purchased anyway), whereas shopping plugins often take credit for nonincremental purchases.  So if a merchant is unsure, it has good reason to err on the side of web affiliates.

All of this was known and understood literally decades ago.  Stand-down rules were first established in 2002.  Since then, they’ve become increasingly routine, and overall clearer and better enforced.  Crucially, merchants and networks include stand-down rules in their contracts, making this not just a principle and a norm, but a binding contractual obligation.

Detecting testers

How can Honey tell when a user may be a tester?  Honey’s code and config files show that it uses four criteria:

  • New accounts. If an account is less than 30 days old, Honey concludes the user might be a tester, so it disables its prohibited behavior.
  • Low earnings-to-date. In general, under Honey’s current rules, if an account has less than 65,000 points of Honey earnings, Honey concludes the user might be a tester, so it disables its prohibited behavior.  Since 1,000 points can be redeemed for $10 of gift cards, this threshold requires having earned $650 worth of points.  That sounds like a high requirement, and it is.  But it’s actually relatively new: As of June 2022, there was no points requirement for most merchants, and for merchants in Rakuten Advertising, the requirement was just 501 points (about $5 of points).  (Details below.)
  • Server-side blacklist. Honey periodically checks a server-side blacklist.  The server can condition its decision on any factor known to the server, including the user’s Honey ID and cookie, or an IP address inside a geofence or on a ban list.  Suppose a user has submitted prior complaints about Honey, as professional testers frequently do.  Honey can blacklist that user’s ID, cookie, and IP or IP range.  Then any further requests from that user, cookie, or IP will be treated as high-risk, and Honey disables its prohibited behavior.
  • Affiliate industry cookies. Honey checks whether a user has cookies indicating having logged into key affiliate industry tools, including the CJ, Rakuten Advertising, and Awin dashboards.  If the user has such a cookie, the user is particularly likely to be a tester, so Honey disables its prohibited behavior.

If even one of these factors indicates a user is high-risk, Honey honors stand-down.  But if all four checks pass, Honey ignores stand-down rules and presents its affiliate links regardless of a prior web publisher’s role.  This isn’t a probabilistic or uncertain dishonoring of stand-down (as plaintiffs posited in litigation against Honey).  Rather, Honey’s actions are deterministic: If a high-risk factor hits, Honey will completely and in every instance honor stand-down; and if no such factor hits, then Honey will completely and in every instance dishonor stand-down (meaning, present its link despite networks’ rules).
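In pseudocode, the gate reduces to a conjunction of four checks.  (A minimal sketch of the logic as I understand it, with invented names: this is my reconstruction, not Honey’s code.  Honey’s actual implementation appears under “Evidence from code” below.)

function shouldDishonorStandDown(user, pointsThreshold) {
  // Any single high-risk signal forces Honey to honor stand-down.
  if (user.accountAgeDays < 30) return false;          // new account: possible tester
  if (user.points < pointsThreshold) return false;     // low earnings-to-date: possible tester
  if (user.onServerBlacklist) return false;            // server-side blacklist hit
  if (user.hasAffiliateDashboardCookie) return false;  // CJ/Rakuten/Awin/etc. dashboard cookie found
  return true;  // all four checks pass: treat as ordinary user, present the link anyway
}

Here pointsThreshold is 65,000 by default (5,001 for Rakuten Advertising merchants), per the config files discussed below.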

These criteria reveal Honey’s effort to obstruct and, frankly, frustrate testers.  In my experience from two decades of testing affiliate misconduct, it is routine for a tester to install a new shopping plugin on a new PC, create a new account, and check for immediate wrongdoing.  By always standing down on new accounts (<30 days), Honey prevents this common test scenario from catching its stand-down violations.  Of course diligent testers will keep checking well past 30 days, but a tester on a short-term contract will perceive nothing amiss.  Some clients may not pay for testers to stick with the task.  And a client that for some reason needs an immediate finding—perhaps to dispute Honey’s recent earnings—can’t get one.

The requirement of accumulating sufficiently many points similarly stymies many testers.  Over the years, a handful of my clients granted a budget for test purchases.  But most did not.  If I wanted to perform test purchases, I would have needed to spend my own money.  Meanwhile, there’s no proper reason why a shopping app should stand down differently for users with points versus new users, so if testers assumed good faith by Honey, they’d have no reason to build up a points history.  This criterion certainly prevented testers from uncovering what Honey was doing.

The interaction of account age, points history, and blacklist further stymies testing.  Once a tester reports a specific observation of misconduct, Honey can blacklist the tester’s account—which will still do most things normally, but will begin to stand down even though unbanned accounts don’t stand down in the same circumstances.  A tester who suspected this would have to create a new account, wait 30 days, and build up a sizable points balance to see what Honey is actually doing for other users.  That’s a genuine barrier, one that impedes even a tester who correctly proposes this hypothesis.

Honey’s detection of affiliate industry cookies is particularly revealing and, in my view, devastatingly incriminating.  Maybe Honey could concoct a story about why it stands down less for established accounts and for high-points earners.  It would be a tall tale, but maybe Honey could convince some people.  In contrast, there is no conceivable reason why a shopping plugin should care whether a user previously logged into an affiliate dashboard.  There’s only one reason to check: to hide from testers.  And that reveals Honey’s intent and Honey’s knowledge that its conduct is prohibited.

Evidence from hands-on testing

Multiple forms of evidence support my finding that Honey detects testers.  First, consider hands-on testing.  With a standard test account with few or no points, Honey honored stand-down.  See video 1.  But when I tricked the Honey plugin into thinking I had tens of thousands of points (details below on how I did this), Honey popped up despite stand-down rules.  See video 2.  I repeated this test over multiple days, as to multiple merchants.  The finding was the same every time.  The only thing I changed between the “video 1” tests and “video 2” tests was the number of points supposedly associated with my account.

To demonstrate Honey checking for affiliate industry cookies, I added a step to my test scenario.  With Honey tricked into thinking I had ample points, I began a test run by logging into a CJ portal used by affiliates.  In all other respects, my test run was the same as video 2.  Seeing the CJ portal cookie, Honey stood down.  See video 3.

Evidence from technical analysis

Some might ask whether the findings in the prior section could be coincidence.  Maybe Honey just happened to open in some scenarios and not others.  Maybe I’m ascribing intentionality to acts that are mere happenstance.  Let me offer two responses.  First, my findings are repeatable, countering any claim of coincidence.  Second, separate from hands-on testing, three separate types of technical analysis—config files, telemetry, and source code—all confirm the findings of the prior section.

Evidence from configuration files

Honey retrieves its configuration settings from JSON files on a Honey server.  Honey’s core stand-down configuration is in standdown-rules.json, while the selective stand-down—declining to stand down according to the criteria described above—is in the separate config file ssd.json.  Here’s the contents of ssd.json as of October 22, 2025, with // comments added by me:

{"ssd": {
"base": {
	"gca": 1, //enable affiliate console cookie check
	"bl": 1,  //enable blacklist check
   "uP": 65000, //min points to disable standdown
  	"adb": 26298469858850
	},
	"affiliates": ["https://www.cj.com", "https://www.linkshare", "https://www.rakuten.com", "https://ui.awin.com", "https://www.swagbucks.com"], //affiliate console cookie domains to check
	"LS": { //override points threshold for LinkShare merchants
		"uP": 5001
	},
	"PAYPAL": {
		"uL": 1,
		"uP": 5000001,
		"adb": 26298469858850
	}
   },
	"ex": { //ssd exceptions
		"7555272277853494990": {  //TJ Maxx
			"uP": 5001
		},
		"7394089402903213168": { //booking.com
			"uL": 1,
			"adb": 120000,
			"uP": 1001
		},
		"243862338372998182": { //kayosports
			"uL": 0,
			"uP": 100000
		},
		"314435911263430900": {
			"adb": 26298469858850
		},
		"315283433846717691": {
			"adb": 26298469858850
		},
		"GA": ["CONTID", "s_vi", "_ga", "networkGroup", "_gid"] //which cookies to check on affiliate console cookie domains
	}
}

On its own, the ssd config file is not a model of clarity.  But source code (discussed below) reveals the meaning of the abbreviations in ssd.  uP refers to user points—the minimum number of points a user must have in order for Honey to dishonor stand-down.  Note the current base (default) requirement of at least 65,000 user points, though the subsequent LS section sets a far lower threshold of just 5,001 for merchants on the Rakuten Advertising (LinkShare) network.  bl set to 1 instructs the Honey plugin to stand down if the server-side blacklist so instructs.

Meanwhile, the affiliates and ex GA data structures establish the affiliate industry cookie checks mentioned above.  The affiliates entry lists the domains where cookies are to be checked.  The ex GA data structure lists which cookie is to be checked for each domain.  Though these are presented as two one-dimensional lists, Honey’s code actually checks them in conjunction: the first-listed affiliate network domain is checked for the first-listed cookie, then the second for the second, and so forth.  One might ask why Honey stored the domain names and cookie names in two separate one-dimensional lists, rather than in a two-dimensional list, name-value pairs, or similar.  The obvious answer is that Honey’s approach kept the domain names more distant from the cookies on those domains, making its actions that much harder for testers to notice even if they got as far as this config file.
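To make the pairing concrete, here is how the two lists line up (my own illustration with a hypothetical checkForCookie helper, not Honey’s code):

// Parallel lists, as in ssd.json: element i of one corresponds to element i of the other.
const affiliateDomains = ["https://www.cj.com", "https://www.linkshare", "https://www.rakuten.com", "https://ui.awin.com", "https://www.swagbucks.com"];
const cookieNames      = ["CONTID", "s_vi", "_ga", "networkGroup", "_gid"];

affiliateDomains.forEach((domain, i) => checkForCookie(domain, cookieNames[i]));

// A transparent design would pair them explicitly, e.g.:
//   [{ url: "https://www.cj.com", cookie: "CONTID" }, ...]
// Splitting the lists keeps each domain apart from its cookie, so a casual reader of the config sees no connection.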

The rest of ex sets exceptions to the standard (“base”) ssd.  This lists five specific ecommerce sites (each referenced by a long numeric ID previously assigned by Honey) with adjusted ssd settings.  For Kayosports, the exception sets an even higher points requirement to cancel stand-down (100,000 points), and booking.com gets its own overrides (uP of 1,001 plus a special adb value of 120,000).  I interpret these exceptions as responses to complaints from those sites.

Evidence from telemetry

Honey’s telemetry is delightfully verbose and, frankly, easy to understand, including English explanations of what data is being collected and why.  Perhaps Google demanded improvements as part of approving Honey’s submission to Chrome Web Store.  (Google enforces what it calls “strict guidelines” for collecting user data.  Rule 12: data collection must be “necessary for a user-facing feature.”  The English explanations are most consistent with seeking to show Google that Honey’s data collection is proper and arguably necessary.)  Meanwhile, Honey submitted much the same code to Apple as an iPhone app, and Apple is known to be quite strict in its app review.  Whatever the reason, Honey telemetry reveals some important aspects of what it is doing and why.

When a user with few points gets a stand-down, Honey reports that in telemetry with the JSON data structure "method":"suspend".  Meanwhile, the nearby JSON variable state gives the specific ssd requirement that the user didn’t satisfy—in my video 1, "state":"uP:5001", reporting that, in this test run, my Honey app had fewer than 5,001 points, and the ssd logic therefore decided to stand down.  See video 1 at 0:37-0:41, or the screenshots below for convenience.  (My network tracing tool converted the telemetry from plaintext to a JSON tree for readability.)

Fiddler reference to Honey telemetry transmission

Fiddler decodes Honey JSON telemetry reporting standdown ("method=suspend")

Fiddler decodes Honey JSON telemetry reporting the reason for stand-down, namely insufficient points ("uP"), less than the 5001 threshold applicable for this network
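For readers who prefer text to screenshots, the decoded telemetry boils down to something like this (a reconstruction from my captures, with surrounding fields omitted and // comments added by me):

{
  "method": "suspend",  // Honey stood down
  "state": "uP:5001"    // reason: user points below the 5,001 threshold applicable to this network
}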

When I gave myself more points (video 2), state instead reported ssd—indicating that all ssd criteria were satisfied, so Honey presented its offer and did not stand down.  See video 2 at 0:32.

Fiddler decodes Honey JSON telemetry reporting the decision not to stand down ("state=ssd")

Finally, when I browsed an affiliate network console and allowed its cookie to be placed on my PC, Honey telemetry reported "state":"gca".  As in video 1, the state value reports that ssd criteria were not satisfied—in this case because the gca (affiliate dashboard cookie) check was triggered, causing ssd to decide to stand down.  See video 3 at 1:04-1:14.

Fiddler decodes Honey JSON telemetry reporting gca=1, meaning that an affiliate network console cookie was detected.

In each instance, the telemetry matched identifiers from the config file (ssd, uP, gca).  And as I changed from one test run to another, the telemetry transmissions tracked my understanding of Honey’s operation.  Readers can check this in my videos: After Honey does or doesn’t stand down, I opened Fiddler to show what Honey reported in telemetry, in each instance in one continuous video take.

Evidence from code

As a browser extension, Honey provides client-side code in JavaScript.  Google’s Code Readability Requirements allow minification—removing whitespace, shortening variable and function names.  Honey’s code is substantial—after deminification, more than 1.5 million lines.  But a diligent analyst can still find what’s relevant.  In fact, the relevant parts are clustered together, and easily found via searches for obvious strings such as “ssd”.

In a surprising twist, Honey in one instance released something approaching original code to Apple as an iPhone app.  In particular, Honey included sourceMappingURL metadata that allows an analyst to recover original function and variable names.  (Instructions.)  That release was a moment in time, and Honey subsequently made revisions.  But where that code is substantially the same as the code currently in use, I present the unobfuscated version for readers’ convenience.  Here’s how it works:

First, there’s setup, including periodically checking the Honey killswitch URL /ck/alive:

return e.next = 7, fetch("".concat("https://s.joinhoney.com", "/ck/alive"));

If the killswitch returns “alive”, Honey sets the bl value to 0:

c = S().then((function(e) {
    e && "alive" === e.is && (o.bl = 0)
}))

The ssd logic later checks this variable bl, among others, to decide whether to cancel stand-down.

The core ssd logic is in a long function called R(), which runs an infinite loop with a switch statement to proceed through a series of numbered cases.

function(e) {
for (;;) switch (e.prev = e.next) {

Focusing on the sections relevant to the behavior described above: Honey makes sure the user’s email address doesn’t include the string “test”, and checks whether the user is on the killswitch blacklist.

if (r.email && r.email.match("test") && (o.bl = 0), // email containing "test": set bl=0, which triggers stand-down in P() below
    !r.isLoggedIn || t) {
  e.next = 7;
  break
Honey computes the age of the user’s account by subtracting the account creation date (r.created) from the current time:

case 8:
  o.uL = r.isLoggedIn ? 1 : 0,     // uL: logged-in flag
  o.uA = Date.now() - r.created;   // uA: account age (current time minus account creation time)

Honey checks for the most recent time a resource was blocked by an ad blocker:

case 20:
  return p = e.sent, l && a.A.getAdbTab(l) ? o.adb = a.A.getAdbTab(l) : a.A.getState().resourceLastBlockedAt > 0 ? o.adb = a.A.getState().resourceLastBlockedAt : o.adb = 0  // adb: when an ad blocker last blocked a resource, else 0

Honey checks whether any of the affiliate domains listed in the ssd affiliates data structure carries the console cookie named in the corresponding position of the GA data structure.

m = p.ex && p.ex.GA || []
g = i().map(p.ssd && p.ssd.affiliates, (function(e) {
            return f += 1, u.A.get({
                name: m[f], //cookie name from GA array
                url: e  //domain to be checked
            }).then((function(e) {
                e && (o.gca = 0) //if cookie found, set gca to 0
            }))

Then the comparison function P() compares each retrieved or calculated value to the threshold from ssd.json.  The fundamental logic is that if any retrieved or calculated value (received in variable e below) is less than the corresponding threshold t from ssd, the ssd logic will honor stand-down.  In contrast, if all four values meet their thresholds, ssd will cancel the stand-down.  If this function elects to honor stand-down, the return value gives the name of the rule (a) and the threshold (s) that caused the decision.  If this function elects to dishonor stand-down, it returns “ssd” (the function’s default if not overridden by the logic that follows).  This yields the state= values I showed in telemetry and presented in screenshots and videos above.

function P(e, t) {
    var r = "ssd";
    return Object.entries(t).forEach((function(t) {
        var n, o, i = (o = 2, _(n = t) || b(n, o) || y(n, o) || g()),
            a = i[0],  // field name (e.g., uP, gca, adb)
            s = i[1];  // threshold value from ssd.json
        "adb" === a && (s = s > Date.now() ? s : Date.now() - s),  // special handling for adb timestamps
        void 0 !== e[a] && e[a] < s && (r = "".concat(a, ":").concat(s))  
    })), r
}
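To see the function in action, consider two hypothetical calls (values invented for illustration; the adb entry omitted for simplicity):

// User with 5,000 points: the uP comparison fails, so P() reports the rule and threshold.
P({ uP: 5000, gca: 1, bl: 1 }, { uP: 65000, gca: 1, bl: 1 });   // returns "uP:65000" → honor stand-down

// User with 70,000 points: every value meets its threshold, so the default "ssd" survives.
P({ uP: 70000, gca: 1, bl: 1 }, { uP: 65000, gca: 1, bl: 1 });  // returns "ssd" → dishonor stand-down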

Special treatment of eBay

Reviewing both config files and code, I was intrigued to see eBay called out for greater protections than others.  Where Honey stands down for other merchants and networks for 3,600 seconds (one hour), eBay gets 86,400 seconds (24 hours).

"regex": "^https?\\:\\/\\/rover\\.ebay((?![\\?\\&]pub=5575133559).)*$",
"provider": "LS",
"overrideBl": true,
"ttl": 86400

Furthermore, Honey’s code includes an additional eBay backstop.  No matter what any config file might say, Honey’s ssd selective stand-down logic will always stand down on ebay.com, even if the standard ssd logic and config files would otherwise decide to disable stand-down.  See this hard-coded eBay stand-down code:

...
const r = e.determineSsdState ? await e.determineSsdState(_.provider, v.id, i).catch() : null,
a = "ssd" === r && !/ebay/.test(p);
...

Why such favorable treatment of eBay?  Affiliate experts may remember the 2008 litigation in which eBay and the United States brought civil and criminal charges against Brian Dunning and Shawn Hogan, previously eBay’s two largest affiliates—jointly paid more than $20 million in just 18 months.  I was proud to have caught them—a fact I can only reveal because an FBI agent’s declaration credited me.  After putting its two largest affiliates in jail and demanding repayment of all the money they hadn’t spent or lost, eBay earned a well-deserved reputation for being smart and tough on affiliate compliance.  Honey is right to want to stay on eBay’s good side.  At the same time, it’s glaring to see Honey treat eBay so much better than other merchants and networks.  Large merchants on other networks could look at this and ask: If eBay gets a 24-hour stand-down and a hard-coded ssd exception, why are they treated worse?

Change over time

I mentioned above that I have historic config files.  First, VPT (the affiliate marketing compliance company where I am Chief Scientist) preserved an ssd.json from June 2022.  As of that date, Honey ssd had no points requirement for most networks.  See “base” below, which in this version includes no uP setting at all.  For LinkShare (Rakuten Advertising), the June 2022 ssd file required 501 points, equal to about $5 of earnings to date.

{"ssd": {
	"base": {"gca": 1, "bl": 1},
	"affiliates": ["https://www.cj.com", "https://www.linkshare", "https://www.rakuten.com", "https://ui.awin.com", "https://www.swagbucks.com"],
	"LS": {"uL": 1, "uA": 2592000, "uP": 501, "SF": {"uP": 200} }, ...

In April 2023, Archive.org preserved ssd.json, with the same settings.

Notice the changes from 2022-2023 to the present—most notably, a huge increase in the points required for Honey not to stand down.  The obvious explanation for the change is MegaLag’s December 2024 video, and the resulting litigation, which brought new scrutiny to whether Honey honors stand-down.

A second relevant change: as of 2022-2023, ssd.json included a uA setting for LinkShare, requiring an account age of at least 2,592,000 seconds (30 days).  But the current version of ssd.json has no uA setting, neither for LinkShare merchants nor for any others.  Perhaps Honey figures the high points requirement (65,000) now obviates the need for a 30-day account-age requirement.

In litigation, plaintiffs should be able to obtain copies of Honey config files indicating when the points requirement increased, and for that matter management discussions about whether and why to make this change.  If the config files show ssd in similar configuration from 2022 through to fall 2024, but cutoffs increased shortly after MegaLag’s video, it will be easy to infer that Honey narrowed ssd, and increased stand-down, after getting caught.

Despite Honey recently narrowing ssd to more often honor stand-down, this still isn’t what the rules require.  Rather than comply in full, Honey continued not to comply for the highest-spending users, those with >65k points—whom Honey seems to figure must be genuine users, not testers or industry insiders.

Tensions between Honey and LinkShare (Rakuten Advertising)

Honey’s LinkShare exception presents a puzzle.  In 2022 and 2023, Honey was stricter for LinkShare merchants—more often honoring stand-down, and dishonoring stand-down only for users with at least 501 points.  But in the current configuration, Honey applies a looser standard for LinkShare merchants: Honey now dishonors LinkShare stand-down once a user has 5,001 points, compared to the much higher 65,000-point requirement for merchants on other networks.  What explains this reversal?  Honey previously wanted to be extra careful for LinkShare merchants—so why now be less careful?

The best interpretation is a two-step sequence.  First, at some point Honey raised the LinkShare threshold from 501 to 5,001 points—likely in response to a merchant complaint or LinkShare network quality concerns.  Second, when placing that LinkShare-specific override into ssd.json, Honey staff didn’t consider how it would interact with later global rules—especially since the overall points requirement (base uP) didn’t yet exist.  Later, MegaLag’s video pushed Honey to impose a 65,000-point threshold for dishonoring stand-down across all merchants—and when Honey staff imposed that new rule, they overlooked the lingering LinkShare override.  A rule intended to be stricter for LinkShare now inadvertently makes LinkShare more permissive.
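Here is the override precedence as I reconstruct it from the configs and code (my sketch with invented names, not Honey’s implementation; cfg is the parsed ssd.json):

// Per-merchant exception beats per-network override beats global base.
function effectiveUpThreshold(cfg, network, merchantId) {
  const ex = cfg.ex && cfg.ex[merchantId];
  if (ex && ex.uP !== undefined) return ex.uP;     // e.g. kayosports: 100000
  const net = cfg.ssd[network];
  if (net && net.uP !== undefined) return net.uP;  // e.g. LS (LinkShare): 5001
  return cfg.ssd.base.uP;                          // global default: 65000
}

// So for a LinkShare merchant today, the lingering LS override (5001) undercuts
// the newer base requirement (65000): exactly the reversal described above.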

Reflections on hiding from testers

In a broad sense, the closest analogue to Honey’s tactics is Volkswagen’s Dieselgate.  Recall the 2015 discovery that Volkswagen programmed certain diesel engines to activate their emissions controls only during laboratory testing, but not in real-world driving.  Revelation of Volkswagen’s misconduct led to the resignation of Volkswagen’s CEO.  Fines, penalties, settlements, and buyback costs exceeded $33 billion.

In affiliate marketing, the numbers are smaller, but defeating testing is, regrettably, more common.  For decades I’ve been tracking cookie-stuffers, which routinely use tiny web elements (1×1 IFRAMEs and IMG tags) to load affiliate cookies, and sometimes further conceal those elements using CSS such as display:none.  Invisibility quite literally conceals what occurs.  In parallel, affiliates deployed additional concealment methods.  Above, I mentioned Dunning and Hogan, who concealed their misconduct in two additional ways.  First, they stuffed each IP address at most once.  Consider a researcher who suspected a problem, but didn’t catch it the first time.  (Perhaps the screen-recorder and packet sniffer weren’t running.  Or maybe this happened on a tester’s personal machine, not a dedicated test device.)  With a once-per-IP-address rule, the researcher couldn’t easily get the problem to recur.  (Source: eBay complaint, paragraph 27: “… only on those computers that had not been previously stuffed…”)  Second, they geofenced eBay and CJ headquarters. (Source.)  Shawn Hogan even admitted intentionally not targeting the geographic areas where he thought I might go.  Honey’s use of a server-side blacklist allows similar IP filtering and geofencing, as well as more targeted filtering such as always standing down for the specific IPs, cookies, and accounts that previously submitted complaints.

A 2010 blog post from affiliate trademark testers BrandVerity uncovered an anti-test strategy arguably even closer to what Honey is doing.  In this period, history sniffing vulnerabilities let web sites see what other pages a user had visited: Set visited versus unvisited links to different colors, link to a variety of pages, and check the color of each link.  BV’s perpetrator used this tactic to see whether a user had visited tools used by affiliate compliance staff (BV’s own login page, LinkShare’s dashboard and internal corporate email, and ad-buying dashboards for Google and Microsoft search ads).  If a user had visited any of these tools, the perpetrator would not invoke its affiliate link—thereby avoiding revealing its prohibited behavior (trademark bidding) to users who were plainly affiliate marketing professionals.  For other users, the affiliate bid on prohibited trademark terms and invoked affiliate links.  Like Honey, this affiliate distinguished normal users from industry insiders based on prior URL visits.  Of course Honey’s superior position, as a browser plugin, lets it directly read cookies without resorting to CSS history tricks.  But that only makes Honey worse.  No one defended the affiliate BV caught, and I can’t envision anyone defending Honey’s tactic here.
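For those curious, the circa-2010 technique worked roughly like this (a sketch of the long-patched vulnerability: modern browsers deliberately lie to scripts about :visited styling, so this no longer works):

// Style visited links distinctively, then probe URLs and read back the computed color.
const style = document.createElement("style");
style.textContent = "a:visited { color: rgb(1, 2, 3); }";
document.head.appendChild(style);

function probablyVisited(url) {
  const a = document.createElement("a");
  a.href = url;
  document.body.appendChild(a);
  const visited = getComputedStyle(a).color === "rgb(1, 2, 3)";  // visited links picked up the :visited color
  a.remove();
  return visited;
}

// e.g., probing for affiliate-compliance staff:
// probablyVisited("https://ui.awin.com/")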

In a slightly different world, it might be considered part of the rough-and-tumble of commerce that Honey sometimes takes credit for referrals that others think should accrue to them.  (In fact, that’s an argument Honey recently made in litigation: “any harm [plaintiffs] may have experienced is traceable not to Honey but to the industry standard ‘last-click’ attribution rules.”)  There, Honey squarely ignores network rules, which require Honey to stand down, though MegaLag showed Honey does not.  But if Honey just ignored network stand-down rules, brazenly, it could push the narrative that networks and merchants acquiesced since, admittedly, they didn’t stop Honey.  By hiding, Honey instead reveals that it knows its conduct is prohibited.  When we see networks and merchants that didn’t ban Honey, the best interpretation (in light of Honey’s trickery) is not that they approved of Honey’s tactics, but rather that Honey’s concealment prevented them from figuring out what Honey was doing.  And the effort Honey expended to conceal its behavior from industry insiders makes it particularly clear that Honey knew it would be in trouble if it was caught.  Honey’s knowledge of misconduct is precisely opposite to its media response to MegaLag’s video, and equally opposite to its position in litigation.

Five years ago, Amazon warned shoppers that Honey was a “security risk.”  At the time, I wrote this off as sour grapes—a business dispute between two goliaths.  I agreed with Amazon’s bottom line that Honey was up to no good, but I thought the real problems with Honey were harm to other affiliates and harm to merchants’ marketing programs, not harm to security.  With the passage of time, and revelation of Honey’s tactics including checking other companies’ cookies and hiding from testers, Amazon is vindicated.  Notice Honey’s excessive permissions—which include letting Honey read users’ cookies at all sites.  That’s well beyond what a shopping assistant truly needs, and it allows all manner of misconduct including, unfortunately, what I explain above.  Security risk, indeed.  Kudos to Amazon for getting this right from the outset.

At VPT, the ad-fraud consultancy, we monitor shopping plugins for abusive behavior.  We hope shopping plugins will behave forthrightly—doing the same thing in our test lab that they do for users.  But we don’t assume it, and we have multiple strategies to circumvent the techniques that bad actors use to trick those monitoring their methods.  We constantly iterate on these approaches as we find new methods of concealment.  And when we catch a shopping plugin hiding from us, we alert our clients not just to the misconduct but also to the concealment—an affirmative indication that the plugin can’t be trusted.  We have scores of historic test runs showing misconduct by Honey in a variety of configurations, targeting dozens of merchants on all the big networks, with both low points and high points, and with both screen-cap video and packet-log evidence of Honey’s actions.  We’re proud to have been documenting Honey’s misconduct for years.

What comes next

I’m looking forward to Honey’s response.  Can Honey leaders offer a proper reason why their product behaves differently when under test, versus when used by normal users?  I’m all ears.

Honey should expect skepticism from Google, operator of the Chrome Web Store.  Google is likely to take a dim view of a Chrome plugin hiding from testers.  Chrome Web Store requires “developer transparency” and specifically bans “dishonest behavior.”  Consider also Google’s prohibition on “conceal[ing] functionality”.  True, Honey was hiding not from Google staff but from merchants and networks; that still violates the plain language of Google’s policy as written.

Honey also distributes its Safari extension through the Apple App Store, requiring compliance with Apple Developer Program policies.  Apple’s extension policies are less developed, yet Apple’s broader app review process is notoriously strict.  Meanwhile Apple operates an affiliate marketing program, making it particularly natural for Apple to step into the shoes of merchants who were tricked by Honey’s concealment.  I expect a tough sanction from Apple too.

Meanwhile, class action litigation is ongoing on behalf of publishers who lost marketing commissions when Honey didn’t stand down.  Nothing in the docket indicates that plaintiffs’ counsel knows the depths of Honey’s efforts to conceal its stand-down violations.  With evidence that Honey was intentionally hiding from testers, plaintiffs should be able to strengthen their allegations of both the underlying misconduct and Honey’s knowledge of wrongdoing.  My analysis also promises to simplify other factual aspects of the litigation.  The consolidated class action complaint discusses the unpredictability of Honey’s stand-down but doesn’t identify the factors that make Honey seem unpredictable—by all indications because plaintiffs (quite understandably) don’t know them.  Faced with unpredictability, plaintiffs resorted to Monte Carlo simulation to analyze the probability that Honey harmed a given publisher in a series of affiliate referrals.  But with clarity on what’s really going on, there’s no need for statistical analysis, and the case gets correspondingly simpler.  The court recently instructed plaintiffs to amend their complaint, and surely counsel will emphasize Honey’s concealment in their next filing.

See also my narrated explainer video.

Notes on hands-on testing methods

Hands-on testing of the relevant scenarios presented immediate challenges.  Most obviously, I needed to test what Honey would do if my account had tens of thousands of points, valued at hundreds of dollars.  But I didn’t want to make hundreds or thousands of dollars of test purchases through Honey.

To change the Honey client’s understanding of my points earned to date, I used Fiddler, a standard web debugging proxy.  I wrote a few lines of FiddlerScript to intercept messages between the Honey plug-in and the Honey server, rewriting the server’s response to report however many points I wanted for a given test.  Here’s my code, in case others want to test for themselves:

// Buffer responses for communications to/from joinhoney.com;
// buffering allows Fiddler to revise the response before the browser sees it.
static function OnBeforeRequest(oSession: Session) {
  if (oSession.fullUrl.Contains("joinhoney.com")) {
    oSession.bBufferResponse = true;
  }
}

// Rewrite Honey's points response to indicate high values.
static function OnBeforeResponse(oSession: Session) {
  if (oSession.HostnameIs("d.joinhoney.com") && oSession.PathAndQuery.Contains("ext_getUserPoints")) {
    var s = '{"data":{"getUsersPointsByUserId":{"pointsPendingDeposit":67667,"pointsAvailable":98765,"pointsPendingWithdrawal":11111,"pointsRedeemed":22222}}}';
    oSession.utilSetResponseBody(s);
  }
}

This fall, VPT added this method, and variants of it, to our automated monitoring of shopping plugins.

Update (January 6, 2026): VPT announced today that it has 401 videos on file showing Honey stand-down violations as to 119 merchants on a dozen-plus networks.

Traffic Laundering and Referer Faking

In a post on VPT’s blog, I explain how rogue affiliates don’t just conceal their misconduct, but make it look like their traffic is coming from a totally unrelated site.  Surprisingly simple tactics can even manipulate HTTP Referer headers—data that many analysts consider trustworthy since it is set by browsers rather than by web sites directly.  Fortunately, VPT automation catches, processes, and even tabulates these violations.  Details: Traffic Laundering and Referer Faking.

Edge Shopping Stand-Down Violations

Affiliate network rules require shopping plugins to “stand down”—not present their affiliate links, not even highlight their buttons—when another publisher has already referred a user to a given merchant.  Inexplicably, Microsoft Shopping often does no such thing.

The basic bargain of affiliate marketing is that a publisher presents a link to a user, who (the publisher hopes) clicks, browses, and buys.  But if a publisher can put reminder software on a user’s computer or otherwise present messages within a user’s browser, it gets an extraordinary opportunity for its link to be clicked last, even if another publisher actually referred the user.  To preserve balance and give regular publishers a fair shot, affiliate networks imposed a stand-down rule: If another publisher already referred the user, a publisher with software must not show its notification.  This isn’t just an industry norm; it is embodied in contracts between publishers, networks, and merchants.  (Terms and links below.)

In 2021, Microsoft added shopping features to its Edge web browser.  If a user browses an ecommerce site participating in Microsoft Cashback, Edge Shopping opens a notification, encouraging the user to click.  Under affiliate network stand-down rules, this notification must not be shown if another publisher already referred that user to that merchant.  Inexplicably, in dozens of tests over two months, I found the stand-down logic just isn’t working.  Edge Shopping systematically ignores stand-down.  It pops open.  Time.  After.  Time.

This is a blatant violation of affiliate network rules.  From a $3 trillion company, with ample developers, product managers, and lawyers to get it right.  As to a product users didn’t even ask for.  (Edge Shopping is preinstalled in Edge, which is of course preinstalled in Windows.)  Edge Shopping used to stand down when required, and that’s what I saw in testing several years ago.  But later, something went terribly wrong.  At best, a dev changed a setting and no one noticed.  Even then, where were the testers?  As a sometimes-fanboy (my first long-distance call was reporting a bug to Microsoft tech support!) and from 2018 to 2024 an employee (details below), I want better.  The publishers whose commissions were taken—their earnings hang in the balance, and not only do they want better, they are suing to try to get it.  (Again, more below.)

Contract provisions require stand-down

Above, I mentioned that stand-down rules are embodied in contract.  I wrote up some of these contract terms in January (there, remarking on Honey violations from a much-watched video by MegaLag).  Restating with a focus on what’s most relevant here (with emphasis added):

Commission Junction Publisher Service Agreement: “Software-based activity must honor the CJ Affiliate Software Publishers Policy requirements… including … (iv) requirements prohibiting usurpation of a Transaction that might otherwise result in a Payout to another Publisher… and (v) non-interference with competing advertiser/ publisher referrals.”

Rakuten Advertising Policies: “Software Publishers must recognize and Stand-down on publisher-driven traffic… ‘Stand-down’ means the software may not activate and redirect the end user to the advertiser site with their Supplier Affiliate link for the duration of the browser session.  … The [software] must stand-down and not display any forms of sliders or pop-ups to prompt activation if another publisher has already referred an end user.”  Stand down must be complete: In a stand-down situation, the publisher’s software “may not operate.”

Impact “Stand-Down Policy Explained”: Prohibits publishers “using browser extensions, toolbars, or in-cart solutions … from interfering with the shopping experience if another click has already been recorded from another partner.”  These rules appear within an advertiser’s “Contracts” “General Terms”, affirming that they are contractual in nature.  Impact’s Master Program Agreement is also on point, prohibiting any effort to “interfere with referrals of End Users by another Partner.”

Awin Publisher Code of Conduct: “Publishers only utilise browser extensions, adware and toolbars that meet applicable standards and must follow “stand-down” rules. … must recognise instances of activities by other Awin Publishers and “stand-down” if the user was referred to the Advertiser site by another Awin Publisher. By standing-down, the Publisher agrees that the browser extension, adware or toolbar will not display any form of overlays or pop-ups or attempt to overwrite the original affiliate tracking while on the Advertiser website.”

Edge does not stand down

In test after test, I found that Edge Shopping does not stand down.

In a representative video, from testing on November 28, 2025, I requested the VPN and security site surfshark.com via a standard CJ affiliate link.

Address bar showing affiliate link as start of navigation
From video at 0:01

CJ redirected me to Surfshark with a URL referencing cjdata, cjevent, aff_click_id, utm_source=cj, and sf_cs=cj.  Each of those parameters indicated that this was, yes, an affiliate redirect from CJ to Surfshark.

Arriving at surfshark.com
From video at 0:04

Then Microsoft Shopping popped up its large notification box with a blue button that, when clicked, invokes an affiliate link and sets affiliate cookies.

Edge Shopping pops open its window
From video at 0:08

Notice the sequence: I began at another publisher’s CJ affiliate link, the merchant’s site loaded, and Edge Shopping did not stand down.  This is squarely within the prohibition of CJ’s rules.

Edge sends detailed telemetry from browser to server reporting what it did, and to a large extent why.  Here, Edge simultaneously reports the Surfshark URL (with cjdata=, cjevent=, aff_click_id=, utm_source=cj, and sf_cs=cj parameters, each indicating a referral from CJ) and also shouldStandDown set to 0 (denoting false/no, i.e., Edge deciding not to stand down).

POST https://www.bing.com/api/shopping/v1/savings/clientRequests/handleRequest HTTP/1.1 
...
{"anid":"","request_body":"{\"serviceName\":\"NotificationTriggering\",\"methodName\":\"SelectNotification\",\"requestBody\":\"{\\\"autoOpenData\\\":{\\\"extractedData\\\":{\\\"paneState\\\":{\\\"copilotVisible\\\":false,\\\"shoppingVisible\\\":false}},\\\"localData\\\":{\\\"isRebatesEnabled\\\":true,\\\"isEdgeProfileRebatesUser\\\":true,\\\"shouldStandDown\\\":0,\\\"lastShownData\\\":null,\\\"domainLevelCooldownData\\\":[],\\\"currentUrl\\\":\\\"https://surfshark.com/?cjdata=MXxOfDB8WXww&cjevent=cb8b45c0cc8e11f0814803900a1eba24&PID=101264606&aff_click_id=cb8b45c0cc8e11f0814803900a1eba24&utm_source=cj&utm_medium=6831850&sf_cs=cj&sf_cm=6831850\\\" ...

With a standard CJ affiliate link, and with multiple references to “cj” right in the URL, I struggle to see why Edge failed to realize this was another affiliate’s referral.  If I were writing stand-down code, I would first watch for affiliate links (as in the first screenshot above), but surely I’d also check the landing page URL for significant strings such as source=cj.  Both methods would have called for standing down.
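Here’s roughly what I have in mind: a minimal sketch of the landing-page check (my illustration, not Microsoft’s code; the parameter list is illustrative only):

// Flag a landing-page URL that carries another affiliate's CJ referral markers.
function looksLikeAffiliateReferral(url) {
  const u = new URL(url);
  const cjParams = ["cjdata", "cjevent", "aff_click_id"];
  if (cjParams.some((p) => u.searchParams.has(p))) return true;
  return ["utm_source", "sf_cs"].some((p) => u.searchParams.get(p) === "cj");
}

// looksLikeAffiliateReferral("https://surfshark.com/?cjevent=abc123&utm_source=cj")  // true → stand down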

Another notable detail in Edge’s telemetry: by collecting the exact Surfshark landing page URL, including the PID= parameter, Microsoft receives information about which other publisher’s commission it is taking.  Were litigation to require Microsoft to pay damages to the publishers whose commissions it took, these records would give direct evidence about who and how much, without needing to consult affiliate network logs.  This method doesn’t always work—some advertisers track affiliates only through cookies, not URL parameters; others strip the URL parameters via redirect in a fraction of a second.  But when it works, more than half the time in my experience, it’s delightfully straightforward.

Additional observations

If I had observed this problem only once, I might ignore it as an outlier.  But no.  Over the past three weeks, I tested a dozen-plus mainstream merchants from CJ, Rakuten Advertising, Impact, and Awin, in 25+ test sessions, all with screen recording.  In each test, I began by pasting another publisher’s affiliate link into the Edge address bar.  Time after time, Edge Shopping did not stand down, and presented its offer despite the other affiliate link.  Usually Edge Shopping’s offer appeared in a popup as shown above.  The main variation was whether this popup appeared immediately upon my arrival at the merchant’s home page (as in the Surfshark example above), versus when I reached the shopping cart (as in the Newegg example below).

In a minority of instances, Edge Shopping presented its icon in Edge’s Address Bar rather than opening a popup.  While this is less intrusive than a popup, it still violates the contract provisions (“non-interference”, “may not activate”, “may not operate”, may not “interfere”, all as quoted above).  Turning blue to attract a user’s attention invites the user to open Edge Shopping and click its link, causing Microsoft to claim commission that would otherwise flow to another publisher.  That’s exactly what “non-interference” rules out.  “May not operate” means do nothing, not even a changed appearance in the Address Bar.  Sidenote: At Awin, uniquely, this seems to be allowed.  See Publisher Code of Conduct, Rule 4, guidance 4.2.  For Awin merchants, I count a violation only if Edge Shopping auto-opened its popup, not if it merely appeared in the Address Bar.

Historically, some stand-down violations were attributed to tricky redirects.  A publisher might create a redirect link like https://www.nytimes.com/wirecutter/out/link/53437/186063/4/153497/?merchant=Lego which redirects (directly or via additional steps) to an affiliate link and on to the merchant (in this case, Lego).  In that era, some shopping plugins had trouble recognizing an affiliate link when it occurred in the middle of a redirect chain.  This was a genuine concern when first raised twenty-plus years ago (!), when Internet Explorer 6’s API limited how shopping plugins could monitor browser navigation.  After two decades of improvements in browser and plugin architecture, this problem is in the past.  (Plus, for better or worse, the contracts require shopping plugins to get it right—no matter the supposed difficulty.)  Nonetheless, I didn’t want redirects to complicate interpretation of my findings.  So all my tests used the simplest possible approach: navigate directly to an affiliate link, as shown above.  With redirects ruled out, the conclusion is straightforward: Edge Shopping ignores stand-down even in the most basic conditions.

I mentioned above that I have dozens of examples.  Posting many feels excessive.  But here’s a second, as to Newegg, from testing on December 5, 2025.

Litigation ongoing

Edge’s stand-down violations are particularly important because publishers have pending litigation about Edge claiming commissions that should have flowed to them.  After MegaLag’s famous December 2024 video, publishers filed class action litigation against Honey, Capital One, and Microsoft.  (Links open the respective dockets.)

I have no role in the case against Microsoft and haven’t been in touch with plaintiffs or their lawyers.  If I had been involved, I might have written the complaint and Opposition to Motion to Dismiss differently.  I would certainly have used the term “stand-down” and would have emphasized the governing contracts—facts for some reason missing from plaintiffs’ complaint.

Microsoft’s Motion to Dismiss was fully briefed as of September 2, and the court is likely to issue its decision soon.

Microsoft’s briefing emphasizes that it was the last click in each scenario plaintiffs describe, and claims that being last click makes it “entitled to the purchase attribution under last-click attribution.”  Microsoft ignores the stand-down requirements laid out above.  Had Microsoft honored stand-down, it would have opened no popup and presented no affiliate link—so the corresponding publisher would have been the last click, and commission would have flowed as plaintiffs say it should have.

Microsoft then remarks on plaintiffs not showing a “causal chain” from Microsoft Shopping to plaintiffs losing commission, and criticizes plaintiffs’ causal analysis as “too weak.”  Microsoft emphasizes the many uncertainties: customers might not purchase, other shopping plug-ins might take credit, networks might reallocate commission for some other reason.  Here too, Microsoft misses the mark.  Of course the world is complicated, and nothing is guaranteed.  But Microsoft needed only to do what the contracts require: stand down when another publisher already referred that user in that shopping session.

Later, Microsoft argues that its conduct cannot be tortious interference because plaintiffs did not identify what makes Microsoft’s conduct “improper.”  Let me leave no doubt.  As a publisher participating in affiliate networks, Microsoft was bound by networks’ contracts including the stand-down terms quoted above.  Microsoft dishonored those contracts to its benefit and to publishers’ detriment, contrary to the exact purpose of those provisions and contrary to their plain language.  That is the “improper” behavior which plaintiffs complain about.  In a puzzling twist, Microsoft then argues that it couldn’t “reasonably know[]” about the contracts of affiliate marketing.  But Microsoft didn’t need to know anything difficult or obscure; it just needed to do what it had, through contract, already promised.

Microsoft continues: “In each of Plaintiffs’ examples, a consumer must affirmatively activate Microsoft Shopping and complete a purchase for Microsoft to receive a commission, making Microsoft the rightful commission recipient if it is the last click in that consumer’s purchase journey.”  It is as if Microsoft’s lawyers have never heard of stand-down.  There is nothing “rightful” about Microsoft collecting a commission by presenting its affiliate link in situations prohibited by the governing contracts.

Microsoft might or might not be right that its conduct is acceptable in the abstract.  But the governing contracts plainly rule out Microsoft’s tactics.  In due course maybe plaintiffs will file an amended complaint, and perhaps it will take an approach closer to what I envision.  In any event, whatever the complaint, Microsoft’s motion-to-dismiss arguments seem to me plainly wrong because Microsoft was required by contract to stand down—and it provably did not.

***

In June 2025, news coverage remarked on Microsoft removing the coupons feature from Edge (a different shopping feature that recommended discount codes to use at checkout) and hypothesized that this removal was a response to ongoing litigation.  But if Microsoft wanted to reduce its litigation exposure, removing the coupons feature wasn’t the answer.  The basis of litigation isn’t that Microsoft Shopping offers (offered) coupons to users.  The problem is that Microsoft Shopping presents its affiliate link when applicable contracts say it must not.

Catching affiliate abuse

I’ve been testing affiliate abuse since 2004.  From 2004 to 2018, I ran an affiliate fraud consultancy, which caught all manner of abuse—including shopping plugins (what that page calls “loyalty programs”), adware, and cookie-stuffing.  My work in that period included detecting the activity that led to the 2008 civil and criminal charges against Brian Dunning and Shawn Hogan (a fact I can only reveal because an FBI agent’s declaration credited me).  I paused this work from 2018 to 2024, but resumed it this year as Chief Scientist of Visible Performance Technologies, which provides automation to detect stand-down violations, adware, low-intention traffic, and related abuses.  As you’d expect, VPT has long been reporting Edge stand-down violations to clients that contract for monitoring of shopping plugins.

My time from 2018 to 2024, as an employee of Microsoft, is relevant context.  I proposed Bing Cashback and led its product management and business development through launch.  Bing Cashback put affiliate links into Bing search results, letting users earn rebates without resorting to shopping plugins or reminders, and avoiding the policy complexities and contractual restrictions on affiliate software.  Meanwhile, Bing Cashback provided a genuine reason for users to choose Bing over Google.  Several years later, others added cashback to Edge, but I wasn’t involved in that.  Later I helped improve the coupons feature in Edge Shopping.  In this period, I never saw Edge Shopping violate stand-down rules.

I ended work with Bing and Edge in 2022, after which I pursued AI projects until I resigned in 2024.  I don’t have inside knowledge about Edge Shopping stand-down or other aspects of Microsoft Cashback in Edge.  If I had such information, I would not be able to share it.  Fortunately the testing above requires no special information, and anyone with Edge and a screen-recorder can reproduce what I report.

Advertising Fraud Detection at VPT Digital

Today I announced joining the security startup VPT Digital as Chief Scientist.  VPT operates in a space I feel I pioneered: Automated testing to find misconduct in affiliate marketing.  As early as summer 2004 (not a typo!), I was catching affiliates using adware to claim commission they hadn’t earned.  I later built automation to scale up my efforts.

Think affiliate fraud is no big deal?  I was proud to recover large amounts for my clients.  For one large client, I once proved that nine of its ten biggest affiliates were breaking its rules—which might sound like a disaster, and in some sense it was, but ejecting the rule-breakers yielded ample funds to pay more to those who genuinely drove incremental value.  Affiliate marketing experts may also remember Shawn Hogan and Brian Dunning, who faced both criminal and civil litigation for affiliate fraud—allegations that the FBI said stemmed from my reports.  Litigation records indicate the defendants collected more than $20 million in 18 months.  “No big deal,” indeed.

The web is a lot messier than when I started down this path, and tricksters use a remarkable range of methods.  Reviewing VPT’s automation, I’ve been suitably impressed.  They test a range of adware, but also cookie-stuffing, typosquatting, and more.  Of course they test Windows adware and browser plug-ins, but they have Mac and mobile capabilities too.  They test from multiple geographies, at all times of day.  Their testing is fully automated, yielding spiffy reports in a modern dashboard—plus email alerts and API integration.  It’s all the features I used to dream of building, and then some.

I’ll be working with VPT part-time in the coming months and years to continue to hone their offerings, including making their reports even more accessible to those who don’t want to be experts at affiliate fraud.  I’ll also blog about highlights from their findings.

Honey’s Contractual Breaches and Value (or Lack of It) to Merchants

On December 21, YouTuber MegaLag dropped a 23-minute video eviscerating Honey.  Calling Honey a “scam”, he made two core allegations.

  1. Honey takes payments that would otherwise go to influencers who recommended products users buy. (video at 2:50) MegaLag shows Honey claiming payments in four scenarios: i) if a user activates a function to search for coupons (even if none are found), ii) if a user activates a function to claim Honey Gold (no matter how meager the rebate), iii) if the user gets the message “We searched for you but didn’t find any deals” and merely presses the button “Got it”, and iv) if Honey shows the message “Get Rewarded with PayPal” “Shop eligible items to earn cash off future purchases” and the user presses “checkout”.

    [Screenshot: Honey claims affiliate commission if a user presses “Got it” to acknowledge no deal found]

  2. Honey doesn’t actually get the best deals for users. If a merchant joins Honey (and begins to pay Honey affiliate commissions), Honey allows the merchant to limit which coupons Honey shows to users. MegaLag points out that letting merchants remove discounts from Honey is squarely contrary to Honey’s promise to users that it will find “the Internet’s best discount codes” and “find every working promo code on the Internet.” (video at 16:20)

With 16 million views and growing, MegaLag’s video has prompted a class action lawsuit and millions of users uninstalling Honey.

I’m a big fan of MegaLag.  I watched most of his other videos, and they’re both informative and useful—for example, testing Apple AirTags by intentionally leaving items to be taken; exploring false claims by DHL about both package status and their supposed investigations.  Meanwhile, nothing in MegaLag’s online profile indicates prior experience in affiliate marketing.  But for a first investigation on this subject, he gets most things right, and he uses many appropriate methods including browser dev tools and screen-capture video.  Based on its size and its practices, Honey absolutely deserves the scrutiny it’s now getting.  Kudos to MegaLag.

Nonetheless there’s a lot MegaLag doesn’t say.  Most notably, he doesn’t mention contracts—the legal infrastructure that both authorizes Honey to get paid and sets constraints on when and how it may operate.  Furthermore, he doesn’t even consider whether merchants get good value for the fees they pay Honey.  In this piece, I explore where I see Honey most vulnerable—both under contract and for merchants looking to spend their marketing funds optimally.

The contracts that bind Honey

Affiliate marketing comprises a web of contracts.  Most affiliate merchants hire a network to track which affiliate sent which traffic, to provide reports to both merchant and publishers, and to handle payments.  For a single affiliate-merchant relationship, an affiliate ends up subject to at least two separate contracts: the network’s standard rules, and any merchant-specific rules.  Of course there are tens of thousands of affiliate merchants, and multiple big networks.  So it’s impossible to make a blanket statement about how all contracts treat Honey’s conduct.  Nonetheless, we can look at some big ones.  Numbering added for subsequent reference.

Commission Junction Publisher Service Agreement

C1 “You must promote Advertisers such that You do not mislead the Visitor”

C2 “the Links deliver bona fide Transactions by the Visitor to Advertiser from the Link”

C3 “You must accurately, clearly and completely describe all promotional methods by selecting the appropriate descriptions and providing additional information when necessary.”

C4 “You agree to: (i) use ethical and legal business practices”

C5 “Software-based activity must honor the CJ Affiliate Software Publishers Policy requirements (as such requirements may be modified from time to time), including but not limited to: (i) installation requirements, (ii) enduser agreement requirements, (iii) afsrc=1 requirements, (iv) requirements prohibiting usurpation of a Transaction that might otherwise result in a Payout to another Publisher (e.g. by purposefully detecting and forcing a subsequent click-through on a link of the same Advertiser) and (v) non-interference with competing advertiser/ publisher referrals.”

Rakuten Advertising Downloadable Software Applications (DSAs) Overview, Testing Process, Policies

R1 “Your DSA should become inactive on the sites of any advertisers who opt-out or stand down on those that do not want you to redirect their traffic.  Publishers who fail to comply with this rule will jeopardize their relationship with advertisers as well as with Rakuten Advertising.”

R2 “[W]e expect your DSA to: Stand down when it recognizes any publisher links”

R3 “[A]ll software must recognize Supplier domains and the linksynergy tracking links. When a Supplier domain or the linksynergy code is detected, the software may not operate or redirect the consumer to the advertiser site using the Software Publisher tracking ID (also known as Supplier Affiliate ID or Encrypted ID). We do not allow any DSA software that interferes with or deters from any Publisher or Advertiser website.”

R4 “The DSA must stand-down and not display any forms of sliders or pop-ups to prompt activation if another publisher has already referred an end user.”

R5 “The DSA must not force clicks or “cookie stuff”. The DSA must not insert a cookie onto the user’s computer without the user knowingly taking an action that results in the cookie being placed.”

R6 “The end user must click through the offer that is presented. Placing the mouse over an offer, only viewing it or viewing all offers is not a click through.”

R7 “The DSA must not automatically drop a cookie when the end user is only viewing offers. The cookie should only be dropped once the end user clicks on a specific offer.”

Awin including ShareASale – Code of Conduct, Awin US Publisher Terms, SAS US Publisher Agreement

A1 “‘Click’ means the intentional and voluntary following of a Link by a Visitor as part of marketing services as reported by the Tracking Code only;”

A2 “Publishers only initiate tracking via a tracking link used for click tracking if the user voluntarily and intentionally interacted with the Ad Media or Tracking link.”

A3 “Publishers only initiate tracking for a specific advertiser if the consumer interacted directly with ad media for this advertiser.”

A4 “do not mislead consumers”

A5 “transparency about traffic sources and the environment that ads are displayed in”

In addition, all networks indicate that publishers must disclose their practices to both networks and merchants.  Awin’s Code of Conduct is representative: “Publishers proactively disclose all promotional activities and obtain advertiser approval for their activities.”  Rakuten’s Testing Process is even more prescriptive, requiring an affiliate both to submit a first version and to notify Rakuten about any changes so it can retest; plus requiring publishers to answer 16 questions about their software, including technical details such as the DOM ID and XPath of key functions.

Honey violates network policies

MegaLag’s video shows violations of these network policies.  I see three clusters of violations.

(1) Honey invokes its affiliate links although users did not fairly request any such thing.  Consider “We searched for you but didn’t find any deals” with button labeled “Got it” (MegaLag scenario iii above). “Got it” doesn’t indicate that the user wants, expects, or agrees that Honey will invoke its affiliate link.  That’s certainly misleading (contrary to rule C1).  Nor can Honey claim that a user who clicks “Got it” is “knowingly taking an action that results in the cookie being placed” (R5) because clicking “Got it” isn’t the kind of action that rule contemplates.  Rakuten rules R6 and R7 are equally on point, disallowing invoking an affiliate link based on an activity that doesn’t indicate intent (such as a mouseover), and requiring that an affiliate link only be invoked “once the end user clicks on a specific offer.”  “Got it” isn’t an offer, so under R7, that’s not grounds for invoking a Rakuten link.  So too for Awin, where A1 defines “click” to include only links that are “part of marketing services” (but “Got it” is not a marketing service).  See also A2 and A3 (allowing links only as part of “ad media”, but “Got it” is not ad media); and of course A4 (“do not mislead consumers”).

Honey’s invocation of affiliate links upon a “Get rewarded with PayPal” message (MegaLag scenario iv above) is on similarly shaky ground.  For example, responding to a PayPal offer is not “knowingly taking an action that results in the cookie being placed” (R5) – the user knows only that he’s closing the message, not that he’s requesting an affiliate referral back to the merchant.  Similarly, a PayPal offer is not “marketing services” or “ad media” for an Awin merchant (rules A1-A3).

The rule to invoke affiliate links only when a user so requests is no mere technicality.  In affiliate marketing, an affiliate may be paid if 1) the user sees a link, 2) the user clicks the link, and 3) the user buys from the specified merchant.  Skipping step 2 sharply increases the circumstances in which a merchant has to pay commission—not a term a merchant would agree to.  When an affiliate skips step 2, it’s cookie-stuffing.  Publishers have gone to jail for this (and had to pay back commissions received).  Honey didn’t quite stuff cookies as that term is usually used—the user did click something.  But when nothing on the button (not its label, not the surrounding message, not any principle of logic or engineering) indicates or even suggests that the button will activate an affiliate link, the difference from cookie-stuffing is vanishingly thin, and the merchant gets terrible value either way.
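To make the distinction concrete, here is a minimal sketch, in TypeScript, of the two behaviors the rules separate.  This is illustrative only, not Honey’s actual code; the function names and tracking URL are hypothetical.

    // Compliant pattern (R6/R7, A1-A3): fire the tracking link only when the
    // user intentionally clicks a clearly labeled offer.
    function onOfferClick(offerTrackingUrl: string): void {
      // The user knowingly followed a marketed offer; this redirect (and the
      // cookie it drops) is what the rules contemplate.
      window.location.href = offerTrackingUrl;
    }

    // The pattern MegaLag documented is closer to this: an acknowledgement
    // button that silently loads an affiliate tracking URL anyway.
    function onGotItClick(merchantId: string, affiliateId: string): void {
      // Nothing about "Got it" tells the user a referral is being claimed,
      // which is why R5's "knowingly taking an action" test is not met.
      // (Hypothetical tracking endpoint, for illustration only.)
      const url = `https://track.example-network.com/click?aid=${affiliateId}&mid=${merchantId}`;
      window.location.href = url;
    }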

(2) Honey presents its affiliate links although a user recently clicked through another publisher’s offer.  (MegaLag at 2:50)  But networks’ rules require Honey to stand down if another publisher has made a referral.  See rule C5.v (“non-interference with competing advertiser/ publisher referrals”) and R2 (“Stand down when it recognizes any publisher links”).  Rakuten even makes explicit that the stand-down obligation applies not just to automatic clicks (which, uh, aren’t permitted in any event) but also to sliders and popups: “The DSA must stand-down and not display any forms of sliders or pop-ups to prompt activation if another publisher has already referred an end user.” (R4)

Here too, this is no technical violation.  Other publishers need “stand down” rules so they have a fair chance to earn commission for their work promoting a given merchant.  Standing down from another affiliate’s click is the most fundamental affiliate network rule for downloadable software and browser plug-ins.
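For readers who want to picture what compliance looks like in software, here is a minimal sketch of stand-down logic, assuming the plugin can observe the navigation that brought the user to the merchant.  The domain patterns are examples (Rakuten’s R3 names linksynergy links specifically); everything else is hypothetical.

    // Tracking-link patterns a compliant plugin might watch for. linksynergy
    // is named in R3; anrdoezrs.net is one of CJ's tracking domains;
    // shareasale.com serves ShareASale links.
    const AFFILIATE_LINK_PATTERNS: RegExp[] = [
      /linksynergy\.com/i,
      /anrdoezrs\.net/i,
      /shareasale\.com/i,
    ];

    function anotherPublisherReferred(referrerHistory: string[]): boolean {
      return referrerHistory.some((url) =>
        AFFILIATE_LINK_PATTERNS.some((pattern) => pattern.test(url))
      );
    }

    function maybeShowOffer(referrerHistory: string[], showPopup: () => void): void {
      if (anotherPublisherReferred(referrerHistory)) {
        return; // R4: stand down entirely; no sliders or pop-ups, not merely no click
      }
      showPopup();
    }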

(3) Honey falls short of disclosure obligations.  “You must accurately, clearly and completely describe all promotional methods by selecting the appropriate descriptions and providing additional information when necessary” (C3).  Publishers must provide “transparency about traffic sources and the environment that ads are displayed in” (A5).  I’m open to being convinced that Honey told networks and merchants it would invoke affiliate links with buttons as weakly labeled as “Got it.”  I don’t buy it.  Merchants have a clear contractual basis to expect complete and forthright disclosures—it is literally their money being paid out.  And merchants authorized networks to collect and evaluate these disclosures for them.  No shortcuts.

One might object that networks can waive rules or create exceptions for key partners.  Not so fast!  Merchants and publishers rely on networks to enforce their published rules exactly as promised.  In fact, in 2007, both merchants and publishers sued ValueClick, alleging that it had been less than diligent in enforcing its rules.  ValueClick’s Motion to Dismiss argued that it could do what it wanted, that it had disclaimed all warranties, and that it made no promises that merchants or publishers were entitled to rely on.  But the court denied ValueClick’s motion, eventually yielding a settlement requiring both improved efforts to detect affiliate fraud and certain refunds to merchants and payments to publishers.  There’s room to disagree about how much benefit the settlement delivered.  (Maybe the settlement promised changes that ValueClick was going to do anyway.  Maybe the monetary payments were a small fraction of the amount lost by merchants and publishers.)  But the fundamental principle was clear: Networks must follow their contractual representations including policies about prohibited behaviors.  And while networks may try to disavow quality responsibilities, for example via disclaimers in contracts, courts are skeptical of the unfettered discretion these provisions purport to create.  A network that promises to track affiliate transactions ultimately ought to do so accurately, and should neither grant arbitrary waivers nor look the other way about serious misconduct.

How did we get here?

Honey’s one-sentence response to MegaLag was “Honey follows industry rules and practices, including last-click attribution.”  It’s no surprise that Honey claims compliance.  But I was surprised to see affiliate thought-leaders agree.  For example, long-time affiliate expert Brook Schaaf remarked “Honey appears to be in compliance with network standards.”  Awin CEO Adam Ross says MegaLag’s video “portray[s] performance marketing attribution as a form of theft or scam”—suggesting that he too thinks Honey did nothing wrong.

I’ll update this piece as others dig into the contracts and compare Honey’s practices with the governing requirements.  But after more than 20 years working on affiliate fraud—my first piece on this subject was, wow, 2004—let me offer four observations.

One, it’s easy to get complacent.  Much of what Honey does is distressingly normal among browser extensions.  Test the Rakuten Cashback app and you’ll find much the same thing.  Above, I linked to litigation against Honey, but there’s also now similar litigation against Capital One, alleging that its Capital One Shopping browser extension is out of line in the same way as Honey.  Brook and Adam are right that Honey’s tactics aren’t a surprise to anyone who’s been in the industry for decades.  Many people have come to accept behaviors that don’t follow the literal meaning of stated policies.  Some would say the policy is out of date.  I’d say, instead, that key decision-makers have been asleep at the switch.

Two, networks’ incentives are mixed.  On one hand, networks want affiliate marketing to be seen as trusted and trustworthy, which requires eliminating practices widely seen as unfair.  At the same time, affiliate networks typically charge a fee on every dollar of commission paid.  As a result, networks directly benefit from anything that increases the number of dollars of commission paid—such as allowing browser plug-ins to change non-commissionable traffic into commissionable traffic.  Merchants should be skeptical of networks too quickly declaring traffic compliant when networks literally get paid for that finding.  With Rakuten operating both a cashback service (with browser plugin) and an affiliate network, their incentives are particularly muddy: If Rakuten Advertising declares a given browser plugin tactic to be permitted, Rakuten Cashback can then use that tactic, increasing both Cashback fees (the Cashback margin on each dollar of rebate) and Advertising fees (the network margin on each dollar of affiliate activity).  I like and respect Rakuten and its leaders, but their complicated incentives mean serious people should give their pronouncements a second look.
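A back-of-envelope calculation shows how both arms profit from a single permitted tactic.  All rates below are assumptions for illustration; actual commission rates, network overrides, and cashback pass-through vary by merchant and program.

    // Hypothetical numbers: a $100 purchase the user was making anyway.
    const sale = 100;
    const commissionRate = 0.10;      // merchant's affiliate payout (assumed)
    const networkOverride = 0.30;     // network fee per commission dollar (assumed)
    const cashbackPassThrough = 0.60; // share of commission rebated to the user (assumed)

    const commission = sale * commissionRate;                      // $10 paid by the merchant
    const advertisingFee = commission * networkOverride;           // $3 to the network arm
    const cashbackMargin = commission * (1 - cashbackPassThrough); // $4 to the cashback arm

    console.log({ commission, advertisingFee, cashbackMargin });
    // Both arms earn revenue on a sale that needed no referral at all.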

Three, most people read the governing contracts hastily if at all.  I’m proud to have pulled out the 17 rules above, and I encourage readers to follow my links to see these and other rules in the larger policy documents.  Fact is, there’s lots of material to digest.  I’ve found that networks’ compliance teams often build rules of thumb that diverge from what the rules actually say, and ignore rules that are in some way seen as inconvenient or overly restrictive.  That’s a mistake.  The rules may not be holy, but they have the force of contract, and there’s real money at issue.  Importantly, networks are spending other people’s money—making sure normal publishers get every dollar they fairly earned; and making sure merchants pay the correct amount, but not a penny more.  This calls for a high level of care.  We’re two weeks into the response to MegaLag.  How many people posted video-responses, blogs, or other remarks without finding, reading, and applying the governing policies?

Four, personalities and work styles invite even merchant staff to accept what Honey is doing.  Representative short-hand: “Go along to get along.”  Most marketers chose this line of work to make connections, not to play policeman.  Attend an affiliate marketing conference and you’re a lot more likely to see DJs and beer (party!) than network sniffers and virtual machines (forensic tools).  Meanwhile, it’s awfully easy for an affiliate manager to tell a boss “we’re working with Honey, the billion-dollar product from PayPal”—then head to the Honey gala at an industry conference.  Conversely, consider the affiliate manager who has to explain “we wasted $50k on Honey last month.”  People have been fired for less.  Ultimately, online marketing plays a procurement function—trying to spend an employer or client’s money as skillfully as possible, to get as much benefit as possible for as little expense as possible.  That’s hard work, and I don’t fault those who want an easier path.  I also don’t fault those who prefer the networking and gala side of marketing over the software forensics.  Nonetheless, collective focus on the fun stuff goes a long way towards explaining how problems can linger (and grow).

Is Honey profitable for merchants?

For a merchant evaluating Honey, the fundamental question is pretty simple: Does Honey bring the merchant incremental sales and positive ROI?  Clearly Honey’s browser extension positions it to claim credit on purchases users were already going to make, but incremental sales are what matter to merchants—purchases made only thanks to Honey.

My hypothesis is that Honey is ROI negative for most merchants.  If a user goes to (say) dell.com, the user is already interested in Dell.  Why should Dell let Honey’s browser plug-in jump in and claim a commission on that user’s purchase?  Maybe Honey will increase the user’s conversion rate from 5% to 5.1% (by proclaiming what a good deal the user has found, or by touting a Honey Gold sweetener).  But with payment to Honey, Dell’s margin will drop from (say) 7% to 5%.  Would Dell prefer 7% profit on 500 sales, or 5% profit on 510?  That math is pretty easy.
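Running the numbers makes the point vivid.  The visitor count and order value below are added assumptions (the hypothetical above implies 10,000 visitors at the stated conversion rates); only the conversion rates and margins come from the paragraph above.

    const visitors = 10_000;
    const orderValue = 1_000; // assumed average order, in dollars

    // Without Honey: 5% conversion at 7% margin.
    const withoutHoney = visitors * 0.05 * orderValue * 0.07;  // 500 sales -> $35,000 profit

    // With Honey: 5.1% conversion, but margin drops to 5% after commission.
    const withHoney = visitors * 0.051 * orderValue * 0.05;    // 510 sales -> $25,500 profit

    console.log(withoutHoney, withHoney); // 35000 25500
    // The conversion lift comes nowhere near covering the commission.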

Of course the numbers in the preceding paragraph are just hypotheticals.  If users sufficiently trust Honey (whether correctly or otherwise), their conversion rate might increase enough to justify Honey’s fees to merchants.  If Honey could somehow persuade users to spend more—“add one more item to your cart, and you can get this $10 coupon”—that could increase value to merchants too (though I’ve never seen Honey deliver such a message).  Some merchant advisors think this is plausible.  I have my doubts.

Alarmingly, many merchants decide to work with Honey (and other “loyalty” software) without rigorously measuring incrementality (or even trying).  Most merchants take some steps to measure the ROI of search and display ads.  For years, affiliate ROI has been more challenging.  But I recently devised a rigorous method that’s doable for most merchants.  I’d enjoy discussing with anyone interested.  When I have findings from a few merchants, with their permission I’ll share aggregate results.
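To give a flavor of what rigorous measurement involves, here is one standard approach, a randomized holdout, sketched with illustrative numbers.  It is not necessarily the method mentioned above, and in practice randomizing exposure requires cooperation from the network or careful engineering by the merchant.

    interface GroupStats {
      users: number;
      purchases: number;
    }

    // Incremental conversion: purchases per user attributable to exposure itself.
    function incrementalLift(exposed: GroupStats, holdout: GroupStats): number {
      const exposedRate = exposed.purchases / exposed.users;
      const holdoutRate = holdout.purchases / holdout.users;
      return exposedRate - holdoutRate;
    }

    // Illustrative numbers only:
    const lift = incrementalLift(
      { users: 50_000, purchases: 2_550 }, // 5.1% convert when offers are shown
      { users: 50_000, purchases: 2_500 }  // 5.0% convert in the holdout
    );
    console.log(lift); // ~0.001: only about 0.1 points of the 5.1% is incremental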

Looking ahead

It’s easy to watch MegaLag’s piece and come out sour on affiliate marketing.  (“What a mess!”)  For that matter, the affiliate marketing section of my site has 28 articles over 20+ years, almost all about some violation or abuse.

Yet I am fundamentally a fan of affiliate marketing.  Incentives aren’t perfectly aligned between affiliate, network, and merchant, but they’re a whole lot closer than in other kinds of online advertising.  One twist in affiliate marketing is that when a rogue affiliate finds a loophole, they can often exploit it at scale—by some indications, even more so than in other kinds of online advertising.  Hence the special importance of networks and merchants both providing fairness and being perceived as providing fairness.  MegaLag’s critique of Honey shows there’s no shortage of work to do.

The Online Ad Scams Every Marketer Should Watch Out For

The Online Ad Scams Every Marketer Should Watch Out For. HBR Online. October 13, 2015.

Imagine you run a retail store and hire a leafleteer to distribute handbills to attract new customers. You might assess her effectiveness by counting the number of customers who arrived carrying her handbill and, perhaps, presenting it for a discount. But suppose you realized the leafleteer was standing just outside your store’s front door, giving handbills to everyone on their way in. The measured “effectiveness” would be a ruse, merely counting customers who would have come in anyway. You’d be furious and would fire her in an instant. Fortunately, that wouldn’t actually be needed: anticipating being found out, few leafleteers would attempt such a scheme.

In online advertising, a variety of equally brazen ruses drain advertisers’ budgets — but usually it’s more difficult for advertisers to notice them. I’ve been writing about this problem since 2004, and doing my best to help advertisers avoid it.

In this piece for HBR Online, I survey these problems in a variety of types of online advertising — then try to offer solutions.

Risk, Information, and Incentives in Online Affiliate Marketing

Edelman, Benjamin, and Wesley Brandi. “Risk, Information, and Incentives in Online Affiliate Marketing.” Journal of Marketing Research (JMR) 52, no. 1 (February 2015): 1-12. (Lead Article.)

We examine online affiliate marketing programs in which merchants oversee thousands of affiliates they have never met. Some merchants hire outside specialists to set and enforce policies for affiliates, while other merchants ask their ordinary marketing staff to perform these functions. For clear violations of applicable rules, we find that outside specialists are most effective at excluding the responsible affiliates, which we interpret as a benefit of specialization. However, in-house staff are more successful at identifying and excluding affiliates whose practices are viewed as “borderline” (albeit still contrary to merchants’ interests), foregoing the efficiencies of specialization in favor of the better incentives of a company’s staff. We consider the implications for marketing of online affiliate programs and for online marketing more generally.