Since the revelation that Facebook lost control of the data of as many as 87 million users through Cambridge Analytica, the political consultancy that worked for the Trump campaign, data privacy advocates have called for greater regulation. The way to regulate Facebook, however, isn’t to lobby for a federal change in data privacy laws.
It’s much more likely that Facebook will first face regulation from the states, eventually forcing it to adopt stricter policies systemwide for how it treats people’s data. In developing a strategy to launch such an effort, data privacy activists are looking to an unlikely industry: food.
The food industry in recent years has been forced to adopt de facto national standards for labeling and production practices—covering genetically modified foods, cage-free eggs, and meat from humanely raised animals—after several states, and in one case a single state, passed strict laws of their own. In order to comply with a single state’s privacy law, tech companies would similarly have to adopt nationwide standards, says Ari Waldman, director of the New York Law School’s Innovation Center for Law and Technology.
“Because Congress has shirked its responsibility by failing to pass a comprehensive data privacy law, it’s up to the states to experiment,” Waldman says. “When given the opportunity, people are very interested in their privacy. That means there is a wide open space for advocacy groups to galvanize popular interest in this.”
To be sure, the regulation of data is not completely analogous to the regulation of food—the supply chains are different, and one involves handling a tangible, physical product. Still, the parallels between the two offer a compelling strategic template that may well inform how new regulation takes shape.
How food got regulated
The food industry’s recent history shows how this can play out. After years of working unsuccessfully to get Congress to pass legislation regulating how animals are treated before slaughter, animal activists at the Humane Society of the United States (HSUS) came up with a new strategy in 2006.
The organization backed an Arizona ballot measure, Proposition 204, that would ban the intensive confinement of pigs and veal calves. It passed with 62% of the vote. Only about 7 million people live in Arizona, but the law had a dramatic ripple effect, says Josh Balk, who oversees HSUS’ farm animal protection initiatives.
“Smithfield—the largest pork producer in the United States—just two months later announced they were getting rid of gestation crates,” Balk says. The American Veal Association followed suit shortly thereafter.
A similar story played out in HSUS’ cage-free egg movement. In 2008, the group threw its weight behind a ballot measure in California that would make it illegal to sell meat and eggs from animals confined in cages in which they could not freely turn around, lie down, stand up, and fully extend their limbs. The proposal passed with 63% of the vote, essentially forcing egg producers in Iowa to physically redesign their egg-laying facilities if they wanted to sell to California’s more than 39 million people. A similar measure passed in Massachusetts in 2016.
Rather than deal with a patchwork of state laws, the country’s largest egg producers have increasingly been redesigning the way they do business, replacing small battery cages with larger operations that allow their animals to move around more freely. Those changes were spurred on by HSUS’ effort to engage with some of the world’s biggest food companies and retailers, such as Walmart and McDonald’s, which have since adopted policies saying they’ll only buy cage-free eggs.
“It just came to the point where we realized Congress wasn’t going to address the issue, and simply put, others will,” Balk says. “Any movement that is looking for change could look to the animal protection field to see how we’ve changed the country for the betterment of animals.”
Activists demanding more transparency about food companies’ use of genetically modified ingredients took a similar tack. When federal legislation proved out of reach, activists mobilized in Vermont—with a population of just 623,000—to pass a state law mandating that all food sold there be clearly labeled if it was genetically modified.
Faced with the prospect of having to create a special line of Vermont food packaging to comply with the state’s new law, food companies quickly banded together and lobbied Congress to pass a watered-down federal law, which superseded the state law but still provided more transparency around the issue. It showed how powerful a small state could be in the larger conversation.
System built on ambiguity
Danielle Citron, chairwoman of the Electronic Privacy Information Center, a group that focuses on emerging data privacy and civil liberties issues, says the food movement could provide a roadmap for how to create better data privacy laws. She has published an authoritative look at how state attorneys general have shaped data privacy law. Tech companies would be particularly averse to a patchwork of state regulations, Citron says.
“We have something to learn perhaps,” she says. “Right now it’s a free-for-all in terms of data collection and usage. We need to set some ground rules. I think we might be in a moment in which we’re going to see some of these efforts.”
Digital data rights are not as clear-cut or tangible as animal rights and the food products people can physically hold in their hands. As ubiquitous as data is in our lives, it remains abstract and difficult to grasp on a personal level. In its simplest form, much of that data is a series of binary digits—ones and zeros that translate into audio, video, text, and geographic tracking information. It constantly zips between laptops, smartphones, and tablets at lightning speed, bouncing among millions of servers around the globe in massive volumes.
The privacy rights attached to that kind of data are also difficult to understand and preserve because of how the companies that deal in data operate. Facebook, Snapchat, Twitter, and Google have designed an internet ecosystem that gives people the illusion that their data is safe and that they are in control of it. But the odds are actually stacked against everyday people, says Northeastern University law professor Woodrow Hartzog, who this month published a book on the topic called Privacy’s Blueprint.
“When you download an app, the entire experience is engineered to get you to give permission,” Hartzog tells Quartz. “We think we have control, but in fact our ability to say no or exert meaningful control is slowly eroding.”
So whether it’s the reassuring padlock icon displayed next to the browser’s address bar or the tick boxes on Facebook’s own privacy permissions page, the message being transmitted to everyday users is that they have the power to control their data destiny. But a look under the hood shows companies actively subvert that sense of authority. Long permission agreements written in legalese are often peppered with double negatives, privacy buttons are designed to lull people into feeling secure, and it’s all completely legal because there are so few laws governing how technology companies use people’s data, Hartzog argues.
“People have a really difficult time assessing risk when it comes to data,” Hartzog says. “So what happens is the entire environment that we work in—our apps and social media—are all engineered to have a sense of safety and a sense of control. Those two concepts drive the entire engine…because if we feel safe and we feel in control we’ll share more.”
While users may believe they are in control of their digital experiences, companies can deploy powerful surveillance technologies and vacuum up details about them—what they look like, what they are talking about, where they are going, and what their routines are—which they then sell to other companies that prize such details for marketing purposes. All the while, data privacy rights silently disappear, he says. With the federal government unwilling to act, it increasingly appears the next wave of regulation will emanate from individual states.
“This is really where you can get some meaningful and imaginative privacy regulation,” Hartzog says.
Privacy rules in Texas and Illinois
Several privacy experts pointed to Texas and Illinois. In January, Google deployed a popular feature through its Arts & Culture app that allowed people to upload a selfie to its servers; in exchange, the app would analyze a person’s face and match it with a doppelgänger from well-known artworks. But the feature wasn’t available to people in Texas or Illinois, because both states have passed biometric-data laws requiring companies to disclose to people how that data will be used.
California has also gotten credit for shaping national data regulation. In 2002, the state became the first to enact a data breach notification law, which in turn became a model for most other US states.
At Carnegie Mellon University, Rahul Telang has watched as researchers have attempted to understand how people feel about data privacy. Perhaps unsurprisingly, people have a difficult time understanding the risks associated with how their data is being used.
“If people are directly harmed, especially financially, they are likely to take more action,” Telang says. Whether Facebook’s recent debacle—in which political consultancy Cambridge Analytica used Facebook data to try to sway opinions in the 2016 US presidential election—will prompt that kind of action remains to be seen. According to data collected by the Pew Research Center, about half of Americans say they don’t trust the government or tech companies to protect their data.
Shirking responsibility
For years, internet technology has developed faster than state and federal privacy laws, and Silicon Valley companies have largely been left in charge of how the data they collect from their customers is used. They have been able to skirt federal regulation largely because of general dysfunction in Washington, DC. Almost every expert Quartz interviewed said they’d rather see a stronger US Federal Trade Commission, with more enforcement power to oversee the tech companies. It appears that won’t be happening anytime soon.
“I think the point to keep in mind is that a state shouldn’t have to do this,” Waldman says. “A single state should not have to feel that the only way to protect its citizens is to pass a law that would become a de facto national standard.”
But in shirking its responsibility, Congress is forcing the states to take up the cause. And that may eventually have a profound effect on what data companies can take from you and how they use it.