It’s a past sin that won’t go away, and it’s about to make Facebook’s relationship with Washington far more complicated.
Embroiled in scandal over a recent data privacy transgression, the Silicon Valley giant is being forced to revisit a consent decree it signed with the US Federal Trade Commission (FTC) in 2011. The commission reportedly opened an investigation this week (March 19) to determine whether the social media company violated the terms of that agreement—terms that could have prevented the more recent controversy.
Last week, it was revealed that the company allowed a Cambridge University researcher, Aleksandr Kogan, access to the data of 50 million Facebook users; Kogan then provided that data to Cambridge Analytica, a political consultancy used by Donald Trump’s 2016 presidential campaign to help him win the White House. But back in 2011, Facebook agreed with the FTC that it would not share its users’ data without their express consent. If the FTC determines Facebook infringed that agreement through the Cambridge Analytica breach, it could theoretically trigger billions of dollars in fines—and fundamentally reshape how Big Tech is regulated.
“There’s plenty of money on the table to make Facebook nervous and get the FTC interested,” says Eric Goldman, the co-director of the High Tech Law Institute at the Santa Clara University School of Law. Facebook’s user data is key to its success; it’s what advertisers use to target Facebook ads. “What needs to happen is a very thorough investigation and a public airing of the way these platforms function,” said Raj Goyle, a former Kansas state legislator and Congressional candidate, who is now co-CEO of legal tech platform Bodhala.
The FTC claimed (pdf) in 2011 that Facebook had engaged in “unfair and deceptive practices” by making public the data that its users considered to be private. That claim included the charge that Facebook had inappropriately shared data with advertisers and outside application developers. Developers now say that Facebook was often lax (paywall) about how it shared user data with them and how they could use it.
The social media giant opted to sign a “consent decree” with the FTC, agreeing not to share its users’ data without their consent. It paid no fines.
It’s not entirely clear whether Facebook’s involvement with Cambridge Analytica would constitute a violation. The company insists it did nothing wrong: “We remain strongly committed to protecting people’s information,” Facebook’s deputy chief privacy officer told Politico. “We appreciate the opportunity to answer questions the FTC may have.”
In 2011, Jessica Rich was serving as the deputy director of the FTC’s Bureau of Consumer Protection, where she helped draw up and oversee the consent decree. She is now the vice president for advocacy at Consumer Reports. In a conversation with Quartz, Rich described what she considers the three main areas that investigators will examine to determine whether Facebook violated its agreement.
Facebook agreed as part of its settlement to establish and maintain “a comprehensive privacy program” that would address privacy risks and protect people’s information. It also agreed to use an independent third-party professional to audit that program. It’s not clear what this auditor did or did not do.
“I think there would be a strong concern that Facebook’s failure to oversee third-party access to user data fell short of these requirements,” Rich said.
The agreement also requires that Facebook not misrepresent to users the extent to which their data will be used.
“In the course of all this, I think the FTC will be looking at whether Facebook made any deceptive statements about third-party access to user data and the oversight of the third parties that Facebook provided,” Rich said.
This is part of a larger and more existential legal question. Facebook permitted Kogan access to user data, but whether the company was responsible for its user data once he passed it along to Cambridge Analytica is an open question.
“That could be a hotter spot for Facebook,” said Goldman of Santa Clara. “We have to look and see exactly what Facebook told users. If it was silent—if Facebook didn’t promise one way or the other—it’s not entirely clear whether Facebook would have default responsibility.”
The last major question concerns what users expect when they configure their privacy settings.
“When people agree to the settings, would they have anticipated that Facebook would fail to monitor third-party access to their data, and fail to enforce its own policies?” Rich asks.
If the answer is no, that opens up a lot of uncomfortable questions for the social media giant to answer.
If it is determined that the company violated the terms of its agreement with the FTC, it could be subject to hefty fines of up to $40,000 per violation. If 50 million users had their data shipped off to Cambridge Analytica without their knowledge, that could theoretically amount to billions of dollars in fines.
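The scale of that exposure is a matter of simple arithmetic. A back-of-the-envelope sketch, assuming (hypothetically) that each affected user counts as a separate violation at the roughly $40,000 statutory ceiling—a maximum courts and the FTC almost never actually impose—shows why even a small fraction of the theoretical total is measured in billions:

```python
# Back-of-the-envelope estimate of theoretical FTC fine exposure.
# Assumptions (hypothetical, for illustration only): each affected user is
# one violation, and each violation draws the ~$40,000 statutory maximum.
MAX_FINE_PER_VIOLATION = 40_000   # dollars per violation (statutory ceiling)
AFFECTED_USERS = 50_000_000       # users whose data reached Cambridge Analytica

theoretical_ceiling = MAX_FINE_PER_VIOLATION * AFFECTED_USERS
print(f"Theoretical ceiling: ${theoretical_ceiling:,}")
# → Theoretical ceiling: $2,000,000,000,000

# Even a tiny fraction of that ceiling still runs to billions of dollars.
one_tenth_of_one_percent = int(theoretical_ceiling * 0.001)
print(f"0.1% of ceiling: ${one_tenth_of_one_percent:,}")
# → 0.1% of ceiling: $2,000,000,000
```

In other words, any settlement negotiated down from that ceiling could still easily land in the billions.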
Whether such a large fine would damage user trust in the company isn’t clear. Goldman believes it’s unlikely. “Facebook has broken consumer trust many times, that’s nothing new,” he said. “And remarkably, the string of disappointments by Facebook hasn’t reduced its number of users and, until recently, hasn’t reduced its usage.”
Internet technology has developed faster than state and federal privacy laws, effectively putting Silicon Valley in charge of how the data it collects from its customers is used. The Obama administration, which also used social media ad targeting in its campaigns, “certainly was not at the top of its game getting ahead of these issues,” said Goyle, who noted he is a Democrat. The last administration’s “cozy relationship” with the tech industry was hardly ideal, he said.
If Congressional lawmakers aren’t willing to simply trust that companies such as Facebook will adequately protect people’s data, they could pass legislation to establish new rules. Passing that kind of sweeping legislation in the current political climate is unlikely, but Democrats and Republicans in Congress are openly suggesting greater regulation might be necessary.
“They just can’t say ‘trust us’ anymore,” Minnesota senator Amy Klobuchar said in an interview on National Public Radio. “We’re going to find out if there’s any meat on the bones.”
Heather Timmons contributed to this report.