[Photo: The entrance sign to Facebook headquarters, seen through two moving buses in Menlo Park, California, on Wednesday, October 10, 2018. Reuters/Elijah Nouvelage]
Maybe it’s time to give up the “hacker way.”
EVERYTHING IN MODERATION

Facebook has been thinking about moderation all wrong

By Nikhil Sonnad

Reporter

With 2.3 billion monthly users, Facebook constitutes the world’s biggest single entity—larger than the population of China and the total number of Christians.

But China and Christianity are different from Facebook, because their members have a sense of shared identity. Facebook's deficiencies on this point are directly linked to its failure to ethically moderate content on its platform.

The unseemly nitty-gritty of Facebook moderation is now clearer than ever, thanks to anonymous tipsters who have provided the New York Times (paywall) and, earlier, Motherboard with over a thousand pages of the company's byzantine rules on what is and is not allowed on the site. The documents contain PowerPoint slides that convey how Facebook's arbiters of ethics have ruled, on issues ranging from particular people to world politics, from their offices in California. For example, one set of guidelines declares that posts about a particular religious political party in Pakistan should get extra scrutiny, while another religious party is deemed "benign."

More than anything, the documents show how badly prepared Facebook is to deal with the hate speech, incitements of violence, and other issues that have plagued it over the past year. There are only 7,500 or so moderators, hired on contract through third parties, policing the millions of posts that go up every minute. These moderators only get about eight to 10 seconds to look at each post, according to the Times. And the moderators often don’t even speak the language of the posts they are reading—they either rely on unreliable Google Translate or, as one moderator explained, just approve these posts by default.

Approving or rejecting billions of posts from billions of users is no doubt a massive challenge. But Facebook’s approach is not making things any easier.

One underlying issue, which has fueled many of the Silicon Valley giant’s problems, is Facebook’s growth-first—rather than user-first—mentality. Its goal of connecting the world at all costs has led it to expand well beyond the scope of what it can keep track of. That’s what leads to cases where, for example, moderators who don’t speak Burmese are deciding whether or not a post in Myanmar constitutes hate speech.

More subtly, though, the growth mentality has turned Facebook primarily into a way to collect users’ attention and data. As people have grown more aware of the company’s privacy issues, as well as the negative association between social media use and happiness, users increasingly view spending time on the platform as a vice—something they should do less of. That’s why a large percentage of Americans have begun taking a break from Facebook or deleting it altogether. Marc Benioff, the CEO of Salesforce, tapped into popular thinking by calling Facebook the “new cigarettes.”

But if Facebook were willing to put users first, it might go a long way toward addressing its problems with both user engagement and moderation. Consider an alternative: Reddit. At 330 million monthly active users, it is much smaller than Facebook. But this still qualifies as "at scale," as they like to say in Silicon Valley. It is, after all, about the population of the United States.

Reddit has its fair share of problems. It is home to white nationalists, Gamergaters, and other bad internet actors. A lot of it is dedicated to porn. But compared to Facebook, it is relatively in control.

The biggest difference is not that Reddit has some exceptionally wise PowerPoint explaining its guidelines, or that it employs thousands of moderators in-house. It is simply that those 330 million users like being on Reddit and care about keeping it an interesting place to be on the internet. Redditors identify with being on Reddit. Meanwhile, nobody calls themselves a "Facebooker" (unless they work at the company).

This identification with the larger community leads Reddit to self-regulate, not entirely unlike how Christians reprimand each other when they don’t follow the rules of Christianity. Each “subreddit” has a set of rules developed by volunteer moderators. Users can have open discussions about whether or not those rules make sense. And if the rules are broadly agreed upon, the users, as well as the moderators, enforce them.

This works because Reddit has limited its scope to things that the parent organization can control. The vast majority of posts are in English, and broadly tend toward internet-y topics like video games, science, and cute gifs. That allows Reddit’s actual moderation team to recognize when a subreddit starts to get out of hand—they aren’t stuck using Google Translate or reference materials from public policy experts.

The solution to Facebook's moderation problem, then, is not to hire 50,000 more moderators who speak all kinds of languages. Rather, the company needs to turn Facebook into something that people actually want to use and be part of. After that, it can relinquish some power to local communities, which can self-regulate. That reduces the scope of things Facebook has to weigh in on: rather than try to police every individual post, it can police higher-level community leaders.

But so long as Facebook’s users consider it a necessary evil, rather than a positive experience, its moderation system will be stuck putting out fires. And as the company continues to grow, those fires will only get bigger.
