Social media giant Meta and its Nairobi-based content review subcontractor Sama will have to defend themselves in court over claims of inhumane working conditions after a lawsuit was filed against them in the Kenyan capital on May 10.
The suit, filed by South African content moderator Daniel Motaung through Nzili and Sumbi Advocates, claims that Sama fired him in 2019 shortly after he tried to form an employee union, a few months after he was hired.
Motaung is seeking compensation on behalf of current and former content moderators and asking the court to compel the two companies to stop thwarting their efforts to form an employee union. He is also asking the companies to provide mental health support to content moderators, who spend thousands of hours watching graphic video clips—including rape, murder, and child abuse—for negligible pay.
Meta’s content moderators view and remove illegal or banned content before users see it
The application also questions Sama’s ‘deceptive’ techniques in its recruitment process, saying the firm advertised for roles such as call center agents and content moderators without disclosing the actual job descriptions to candidates.
“The varying descriptions for the position of a content moderator are deceptive and designed to trick unsuspecting applicants into unknowingly becoming Facebook Content Moderators,” the application by the law firm reads.
It claimed that Meta and Sama have intentionally created a toxic environment at their Nairobi office, designed to keep the moderators from airing their grievances, while using Meta software to track the productivity of Sama’s employees.
Speaking to Quartz, a Meta spokesperson said the Silicon Valley-based tech behemoth takes seriously its responsibility to the people who review content for it, and requires its partners to provide industry-leading pay, benefits, and support.
“We also encourage content reviewers to raise issues when they become aware of them and regularly conduct independent audits to ensure our partners are meeting the high standards we expect.”
Working conditions for Meta’s content moderators have been under scrutiny
Sama has previously denied that it subjected its employees to low pay, psychological trauma, violations of privacy rights, and an opaque recruitment process.
After a Time investigation in February exposed how it treats employees, Sama took to a blog on its website to claim that its business had “lifted more than 59,000 individuals out of poverty.”
Two weeks after the article was published, Sama announced that it had increased employee salaries by 30% to 50%.
But the new minimum salary of $439 per month after tax, or around $2.20 per hour for a nine-hour working day (up from around $1.50 per hour), still remains far below the industry’s global salary bands.
Content moderation for tech giants is also under fire outside of Africa
When asked in March last year by Democratic Rep. Tom O’Halleran about what the company has done to “increase reviewer capacity,” Meta chief executive Mark Zuckerberg said that for certain types of content—such as hate speech and terrorism—Facebook, Instagram, and WhatsApp rely 95-98% on AI for content moderation.
But in July last year, a California judge approved an $85 million settlement between Facebook and more than 10,000 content moderators who had accused the company of failing to protect them from psychological injuries resulting from their exposure to graphic and violent imagery.
Last December, former TikTok content moderator Candie Frazier, hired through a third-party company, sued TikTok in California over ‘severe psychological trauma’ after having to watch hours of videos of rape, animal cruelty, cannibalism, and mass murder.