Mark Zuckerberg’s aggressive note fails to adequately defend Facebook

Or actually, don’t.
Image: Reuters/Francois Walschaerts

Yesterday evening (Oct. 5), Mark Zuckerberg finally broke his silence on all that’s been going on—or going wrong, rather—at Facebook.

Sharing a note he wrote to Facebook staff on his profile, the co-founder and CEO used the first 100 words to apologize for the worst outage the company has had in years. He dedicated the next 1,000 words to whistleblower Frances Haugen, a former Facebook product manager who revealed herself as the source of recent leaks to the Wall Street Journal, and then answered questions from US lawmakers yesterday.

Zuckerberg isn’t at all happy. He says the company has been “misrepresented,” that many of the claims “don’t make sense” and are “deeply illogical,” and that the “false narrative” is based on a “mischaracterization” of research.

“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” he wrote in a note brimming with emotion but offering very little empirical evidence.

A brief recap of Haugen’s allegations

Haugen, a former member of Facebook’s civic misinformation team, said the company prioritizes profits over public good.

She called for regulation of the tech giant, citing the company’s behavior during the 2020 US presidential election, its approach to hate speech and misinformation, and the impact of its sister app Instagram on the mental well-being of young women, among other issues.

Haugen, who testified in front of the Senate yesterday, came armed with a mountain of evidence—she did, after all, copy several internal memos and documents before leaving the company.

What Mark Zuckerberg said

In his statement, Zuckerberg criticized not just Haugen but anybody who has lost trust in Facebook’s various platforms—Facebook users and their parents, internet activists, lawmakers comparing Facebook to Big Tobacco, and others—with a series of “if we did X, then why would we do Y”-type statements:

If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place? If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space—even ones larger than us? If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we’re doing? And if social media were as responsible for polarizing society as some people claim, then why are we seeing polarization increase in the US while it stays flat or declines in many countries with just as heavy use of social media around the world?

“At the heart of these accusations is this idea that we prioritize profit over safety and well-being. That’s just not true,” said the leader of a company projected to rack up $119 billion in revenue this year.

As for the questions raised about kids’ safety, Zuckerberg also reminded everyone that Facebook has “paused” its project to build an Instagram for children under 13. But it only did so after months of backlash.

(Lack of) transparency at Facebook

Zuckerberg made a fair point about not attacking organizations “making an effort to study their impact on the world.” But for all its pretense of transparency, Facebook rarely walks the talk.

Despite Zuckerberg’s claims that big tech can be held accountable via self-published reports and data, Facebook’s own research seems to omit nuances such as linguistic and geographic breakdowns. Nor does the company appear to act on its research findings.

Additionally, there is little transparency around adtech. Journalists and scientists have claimed that the platform’s code interferes with outside monitoring of its ads and content. In August, Facebook drew ire for revoking the access of New York University researchers studying political ads and Covid-19 misinformation, and even banned their personal accounts.

Last month, there were also concerns that Facebook had not been “fully forthcoming” with the Oversight Board, the $130 million corporate high court that Facebook itself set up in 2020 to serve as an independent check on its content moderation decisions.