People say they want to protect their personal information, but new research shows privacy tends to take a backseat to convenience and can easily get tossed out the window for a reward as simple as free pizza.
The study—co-authored by Susan Athey, a senior fellow at the Stanford Institute for Economic Policy Research—provides real-life evidence of a digital privacy paradox: a disconnect between stated privacy preferences and actual privacy choices. It also gives policymakers food for thought about how to regulate data sharing without creating more hassles for consumers.
“Generally, people don’t seem to be willing to take expensive actions or even very small actions to preserve their privacy,” Athey says. “Even though, if you ask them, they express frustration, unhappiness, or dislike of losing their privacy, they tend not to make choices that correspond to those preferences.”
In highlighting the distortions in consumer behavior regarding privacy, the findings suggest that safeguards such as widely used "notice and choice" disclosures are not enough.
Athey and her co-authors, Christian Catalini and Catherine Tucker of MIT, seized a unique opportunity to empirically explore the privacy paradox when MIT launched a project in 2014 to encourage its undergraduates to experiment with Bitcoin.
The researchers examined how 3,108 undergraduates acted on their privacy preferences while choosing an online wallet to store and manage the digital currency. The Bitcoin distribution thus doubled as a measure of the students' actual privacy choices.
Despite the wallets' varying privacy features, the order in which the four options appeared at sign-up seemed to drive many participants' decisions, even when the choice contradicted their stated privacy preferences, the study found.
For instance, when the bank-like wallet offering the most privacy protection from the public was listed first, 78% of the students selected it. When it was listed second or lower, only 65% chose it.
And providing students with more details about each wallet's privacy features made little difference; the effect of ranking order persisted.
What’s more, students who had expressed stronger preferences for privacy—whether it was privacy from the government, the commercial provider, or the public—essentially behaved no differently than those who said privacy was less of a concern, the study found.
To see whether a small incentive could influence a decision about privacy, researchers offered one group of students a free pizza — as long as they disclosed three friends’ email addresses.
An overwhelming majority of the students chose pizza over protecting their friends' privacy. Neither gender nor stated privacy sensitivity appeared to affect the choice.
People "are willing to relinquish private data quite easily when incentivized to do so," the study stated plainly.
Researchers also gave students the option to add encryption when setting up their wallets. Though the encryption would have added no security benefit to future wallet transactions, the offer was meant to test whether participants would take extra steps to protect their privacy.
About half of the students initially opted to add the reassuring feature. Yet only half of that group completed the process; the rest fell back to the easier setup without encryption.
Altogether, the experiment results show that “consumers deviate from their own stated preferences regarding privacy in the presence of small incentives, frictions, and irrelevant information,” the study stated.
The findings, released in June by the National Bureau of Economic Research, provide a rare snapshot: The privacy paradox has been widely observed, but empirical evidence from a real-world setting—involving choices with real consequences—has been limited.
The study points to two competing policy implications.
Since the findings show consumers’ actions don’t align with what they say, and it’s difficult to gauge a consumer’s true privacy preference, policymakers might question the value of stated preferences.
On the other hand, more extensive privacy protections might be needed "to protect consumers from themselves" and their willingness to share data in exchange for relatively small monetary incentives.
In any case, as people are quick to give up some privacy for less hassle, regulations should avoid inadvertently sticking consumers with additional effort or a less smooth experience as they make privacy-protective choices, the study stated.
"The big issue is that consumers say they want privacy, but if, for example, a firm introduced better privacy policies, would it actually get more customers? My observation is that generally, the answer is no," says Athey, who is The Economics of Technology Professor at Stanford Graduate School of Business and has consulted for Microsoft Corp. since 2007.
The traditional economic paradigm assumes that users have full information and make informed choices. But that dynamic breaks down if consumers do not take the time to truly evaluate all the options, she says.
“Then the market provides weaker incentives for firms to really give consumers what they want.”
Consumer laziness may play a role, but Athey also thinks consumers don’t feel they have “meaningful choices” when it comes to how service providers—ranging from social media and email to banking and retail—handle personal data.
For social media, users will gravitate to where their friends are, regardless of privacy policies, Athey explains. At the same time, major email programs all have fairly similar privacy policies, so it's tough to differentiate among them or to know how much switching providers would actually help.
And no matter what businesses do with consumer privacy settings, or even if they blunder and anger users by disclosing or losing too much personal information, it appears that consumers will usually stick with them.
Numbness kicks in, too.
Having consumers repeatedly consent to legal privacy terms or acknowledge cookie banners simply trains them to ignore such notices. As a result, these privacy notices probably have little impact.
“And I don’t see firms offering consumers really great choices about how long they will retain your data,” Athey says. What would happen, say, if consumers had options among providers of how long their data is stored—10 years, two years, one year or six months?
“We don’t have those kinds of meaningful choices, and the policies we have don’t provide firms any incentive to offer those meaningful choices,” she says.
The study's findings about the power of placement and ease of navigation are consistent with consumer behavior that tech firms already know well.
“By and large, when you’re on a small screen, the information that is presented most conveniently is the information that you pay attention to,” Athey says.
That's why it matters how Facebook ranks its news stories, which apps sit at the top of a mobile app store, and which link leads a page of search results.
“All of these technology intermediaries have a huge impact on what you read, what you consume, and what you buy just by how they present you information,” Athey says.
In light of what we know about consumer behavior, “the way privacy policies both here and in Europe have been designed is pretty ineffective,” she says.
“There’s a role for regulation here, clearly, in this area of privacy and security,” Athey says, “but even beyond telling companies what to do, just making it simpler for consumers to make meaningful choices.”