Mark Zuckerberg seems to think Facebook needs its own Supreme Court.
As part of a damage-control press tour following the Cambridge Analytica revelations, Zuckerberg spoke to Vox’s Ezra Klein on a podcast that aired April 2. In the interview, Klein and the Facebook founder discuss a claim Zuckerberg made in a separate interview several years earlier: that in many ways, Facebook is more like a government than a company.
On the podcast, Zuckerberg explains that Facebook—unlike many other companies—has to police speech on its platform, determining what is hateful and unacceptable content, and what is not. Also like a government, the company has to set policies for its constituents. The way that system functions now, he admits, is not great:
I think it’s actually one of the most interesting philosophical questions that we face. With a community of more than 2 billion people all around the world, in every different country, where there are wildly different social and cultural norms, it’s just not clear to me that us sitting in an office here in California are best placed to always determine what the policies should be for people all around the world. And I’ve been working on and thinking through: How can you set up a more democratic or community-oriented process that reflects the values of people around the world?
Klein points out that this is more than just an “interesting philosophical question.” Facebook can shut down the dissemination of terrorist propaganda and child pornography, and how it decides to control speech can also affect elections, suppress political opposition, or amplify authoritarian governments. To a certain extent, the company can determine how quickly misinformation and hate—which can translate to real-life violence—spread online.
Despite hiring legions of moderators and beefing up AI systems aimed at detecting banned content, Facebook still struggles to determine what violates its own rules, and has in the past removed examples of legitimate journalism and art, while leaving up conspiracy theories.
Klein, like many others, wonders whether Facebook has become too big to operate with the same structures and financial incentives as a normal company. Zuckerberg, positioning himself as the benevolent ruler of a state-like entity, counters that everything is going to be fine—because ultimately he controls Facebook.
Klein asks about the downsides of that dominant position: “Do you think that governance structure makes you, in some cases, less accountable?”
In response, Zuckerberg offers a mea culpa about the lack of an effective appeals process in Facebook’s content moderation. And then he floats a novel idea:
Over the long term, what I’d really like to get to is an independent appeal. So maybe folks at Facebook make the first decision based on the community standards that are outlined, and then people can get a second opinion. You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.
You could also imagine such a structure as a dystopian court of elders ruling over how the world communicates, as an invitation to autocratic governments in search of total control, or as a meaningless PR stunt. Quartz reached out to Facebook for clarification on Zuckerberg’s statement; a spokesperson said the company had nothing to add.
Of course, it’s possible the embattled CEO is simply spitballing. But if Zuckerberg’s statement isn’t just a throwaway soundbite, here are just a few questions he should have in mind:
- Does he envision such a structure in every country where Facebook operates? Or a global body?
- If it were global, how many members would it need to represent the will of every nation?
- Whether global or country-based, how would it avoid corruption and undue influence?
- Who would choose its members? How long would they be members of the court?
- How would this change Facebook’s values and terms as a company?
- How would this type of body function, and who would pay for it?
- Wouldn’t such a system facilitate government censorship, especially if it’s country-based?
- Would it actually solve the problem of hate speech and disinformation rising on the platform?
- Who would adjudicate the adjudicators?