Donald Trump and his allies relied on misinformation to bolster support for the former US president ahead of the last election. His campaign, meanwhile, turned to deceptive design to bump up donations from unsuspecting Americans, the New York Times reports.
Last September, the paper reveals, when the Trump campaign faced a cash shortage, it leaned on supporters to turn their one-time donations into monthly—and, eventually, weekly—contributions. The problem is that the campaign’s website didn’t ask people to opt in to this enhanced giving schedule; it asked them to opt out. Trump backers only discovered later that WinRed, the for-profit company that processed Trump campaign payments, was taking hundreds or thousands of dollars out of their bank accounts. The Times’ Shane Goldmacher writes:
The tactic ensnared scores of unsuspecting Trump loyalists—retirees, military veterans, nurses and even experienced political operatives. Soon, banks and credit card companies were inundated with fraud complaints from the president’s own supporters about donations they had not intended to make, sometimes for thousands of dollars.
It’s easy to see how so many were misled. They had not noticed that inside a yellow box filled with breathless ad copy (much in screaming ALL CAPS) about voting Trump into office for four more years, there was a single line at the very bottom in unbolded, smaller text that read: “Make this a weekly recurring donation until 11/3.” The box where one would normally click to agree was prechecked. So, too, was a separate box introduced closer to election day that doubled donations and was dubbed a “money bomb.”
Many people requested and received refunds when they recognized what was happening, but some also paid bank fees for their overdrawn accounts, the Times says. For the duped, it may have felt like an egregious scam, but technically it wasn’t illegal at all. What Trump and WinRed did was just spectacularly unethical: They used what are known as dark patterns, or dark UX (UX is shorthand for user experience), to set up a time-tested trap for followers.
“Dark patterns” that ensnare shoppers are surprisingly common
Harry Brignull, the UK-based user-experience consultant who coined the term “dark patterns” in 2010 to cover all the ways that companies can use colors, images, intentionally difficult-to-navigate design, and fake urgency, said of the pre-checked box move: “It should be in textbooks of what you shouldn’t do.”
Dark patterns are hardly unique to political fundraising. An analysis from Princeton University and the University of Chicago showed that the same sorts of techniques used by the Trump campaign can be found on 11% of all shopping websites, as Quartz reported in 2019.
These visual sleights of hand may include limited-time offers and countdown timers that stir up a sense of FOMO. They could also involve alerts about someone else looking at the same room or buying the same pair of shoes. Gasp!
“The alerts are a type of dark pattern that hijacks our normal tendency to weigh the actions and opinions of others as we make decisions,” Quartz reporters Marc Bain and Amanda Shendruck wrote at the time. “Customer testimonials do the same, and can qualify as a dark pattern if their source isn’t clear.” The customers who appear to be stealing your bargain find from underneath you are not always real people, they warned.
By now, most savvy shoppers know to watch out for free trials that turn into pricey subscriptions because of small print that you didn’t read in your haste to move through your purchase. Brignull calls this type of pattern “forced continuity.”
His website explains that “In some cases, this is made even worse by making it difficult to cancel the membership,” which is actually a separate form of manipulation that Brignull calls a “roach motel,” because it’s easy to get lured in, but impossible to leave. (Have you ever tried to cancel your Amazon membership?)
Other dark patterns can slip purchases into your basket. Or there’s “confirmshaming,” which Brignull describes on his site as “the act of guilting the user into opting into something. The option to decline is worded in such a way as to shame the user into compliance.”
As Recode explained in a recent piece about dark patterns—appropriately published on April 1—sometimes the trickery is hiding in the very place where you think you’ll find freedom from an ad or product, like when the X in the top right-hand corner of a box is so tiny that you can’t see it, or designed so that when you do click on it, you accidentally click on the ad itself.
Will laws catch up with web designers who intentionally trick users?
Another form of deceptive UX leads users to share their private information more widely than they would imagine. In the US, federal lawmakers from both parties have introduced bills to limit that category of dark patterns in the past. The FTC, which would enforce any such regulation, is also investigating dark patterns.
Any new push to outlaw the practices will be complicated, however, because of gray areas, Jennifer King, privacy and data policy fellow at the Stanford University Institute for Human-Centered Artificial Intelligence, told Recode. These are “the instances where users of a technology are being constrained in such a way that they can’t exercise complete autonomy, but that they may not be experiencing full manipulation, or perhaps they are being coerced but with a light touch.”
In Europe, a Consumer Rights Directive already bars companies from using the kinds of prechecked boxes that Trump’s campaign relied on, Goldmacher notes in the Times. Advertisers there cannot force customers to opt in by making that the default choice. Last month, California also approved regulations banning dark patterns that have “the substantial effect of subverting or impairing a consumer’s choice to opt-out.”