Facebook has spent this year ratcheting up the number of content moderators it employs worldwide, going on a hiring spree after a series of accusations that its algorithms alone couldn’t handle the task.
When the company brings on these new workers, it will have roughly 8,750 people focused on moderation, more than double Twitter’s entire headcount. While Twitter has only 328 million monthly users compared to Facebook’s 2 billion, the two companies face many of the same content moderation challenges. Twitter also contends with persistent problems stemming from rampant automated bot accounts, and is under federal investigation concerning Russia’s use of its platform to interfere with the 2016 election.
In May, as Facebook faced criticism that users were live streaming acts of violence, CEO Mark Zuckerberg announced 3,000 new hires to curtail violence, hate speech, and child exploitation on the platform. At the time he said 4,500 people were already working to moderate content, a function Facebook internally calls Community Operations. After a federal inquiry into Russia’s political advertising purchases on Facebook, the company announced 1,000 more hires to the team to focus on advertisements, in addition to 250 people who would work specifically on election interference.
Twitter last reported in July that it employs 3,200 people. Unlike Facebook, whose executives have commented on the thousands of employees it’s hired specifically to track and moderate content, Twitter has never said how many people work in its safety division. But the company told Quartz that it has a team in place that reviews content on the site at all times, and that it will pull in people from across the safety department as needed. The company did not provide details on the size of the department.
Even if you add the entire headcount of Snap, the parent company of the social network Snapchat, which employs roughly 2,600 people including its editorially focused news team, to Twitter’s, Facebook’s moderation team will still be larger than the two companies combined.
Facebook has shown that it can effectively build internal tools that allow employees to slog through large amounts of content. The company previously employed a team of 15 to 18 people to moderate the content that went into its trending news feed. After that team was let go amid allegations of suppressing conservative views, the algorithms that replaced it failed to surface real news, instead promoting content inconsistent with the platform’s terms of service, like a man performing sexual acts on a chicken sandwich.
This isn’t to say that more people and good tools will eliminate all of Facebook’s problems. After all, the company got into its current troubles while it already had 4,500 people moderating content.