On Dec. 6, 2007, the 23-year-old CEO and founder of Facebook apologized for invading his users’ privacy.
A month earlier, the company had unveiled Beacon, a project that allowed partner companies to track when someone bought something on their sites and automatically notify that person’s Facebook friends, sometimes without the buyer’s knowledge.
What was meant to be a new way for the young company to generate revenue was quickly seen as overstepping the bounds of trust the company had with its users. According to The Wall Street Journal (paywall), Mark Zuckerberg had agonized for hours over how to respond, even asking a mentor, Silicon Valley investor Roger McNamee, “Is being a CEO always this hard?”
Zuckerberg ended up crafting a personal note that still lives on the company’s corporate communications site today. He said, quite clearly:
We’ve made a lot of mistakes building this feature, but we’ve made even more with how we’ve handled them. We simply did a bad job with this release, and I apologize for it. While I am disappointed with our mistakes, we appreciate all the feedback we have received from our users. I’d like to discuss what we have learned and how we have improved Beacon.
Today, the company is dealing with the fallout from a similar scandal. On Friday, March 16, the company announced it was suspending Cambridge Analytica, the data company hired by the Trump campaign during the 2016 US presidential election, for mishandling data it acquired years before. Subsequent reports from The New York Times and The Guardian suggest that Cambridge Analytica managed to harvest information on around 50 million Facebook users without their consent, because of the way that Facebook allowed third-party apps to access information on its site.
So far, neither Zuckerberg nor chief operating officer Sheryl Sandberg—who was hired from Google after the Beacon debacle—has spoken publicly about the issue, or the way Facebook permits data to be handled. Facebook’s stock price has fallen by about 12% since Friday.
Presumably, Zuckerberg will eventually say something. But will it be as blunt or clear as he was a decade ago?
Facebook is now a $13 billion-a-quarter advertising behemoth, employing 25,000 people, with billions of people accessing its apps every day. Its business is intrinsically tied to the problem at hand: Advertisers want information on people to better target their ads. The company’s chief security officer, Alex Stamos, is reportedly leaving the company over the mishandling of foreign intervention and fake news on the site. According to The New York Times, he wanted to be more upfront about what countries have done on Facebook to sway elections through buying ads and promoting posts, but that jarred with the business side of the company.
“The people whose job is to protect the user always are fighting an uphill battle against the people whose job is to make money for the company,” Sandy Parakilas, who worked in privacy at Facebook until 2012, told the Times.
Facebook makes it exceedingly difficult to find and sift through all the various privacy and security settings on its platforms. This is on purpose: It wants you to share as much information with it and third parties as possible, because that makes its data better and allows more effective ad targeting, which, in turn, generates billions of dollars of revenue for the company.
In his 2007 apology, Zuckerberg also added:
Facebook has succeeded so far in part because it gives people control over what and how they share information. This is what makes Facebook a good utility, and in order to be a good feature, Beacon also needs to do the same. People need to be able to explicitly choose what they share, and they need to be able to turn Beacon off completely if they don’t want to use it.
Facebook lets you do this now, after the fact. After you’ve set up your Netflix account through Facebook because it was easy; after you pulled in your profile photos from Facebook to Tinder because it was easy; after you added that Facebook game that told you your horoscope because you saw a friend doing it on your feed. Facebook obfuscates what sorts of permissions these apps gain to your account, and what they can do with any data they see.
Facebook has not succeeded because it gives people control. It has succeeded because it’s easy to use, and everyone else is on it. That’s the network effect. Unfortunately, it’s also allowed purveyors of everything from ridiculous toothbrushes to snake oil to hope to find you that much more easily, too.