SOCIAL RESPONSIBILITY

The UK wants new laws to fine Google, Twitter, and Facebook for failing to deal with hate speech

The UK may follow Germany’s lead in dealing with hate speech online. A parliamentary report on hate speech on social media published today calls for “escalating sanctions” that should include “meaningful fines” for companies like Google, Facebook, and Twitter for failing to scrub illegal content within a strict time limit. The report notes Germany’s efforts to do so, which include proposed legislation that could impose fines of up to €50 million ($54.5 million).

Germany’s justice minister Heiko Maas has campaigned since 2015 to take the tech giants to task for hate-speech lapses. Such posts have spiked as the country accepted greater numbers of refugees. In December 2015, Maas got the Silicon Valley firms to agree to remove flagged content within 24 hours.

These efforts have had limited impact. A Quartz analysis performed after the companies agreed to the 24-hour cut-off found that dozens of flagged Facebook posts remained online for over 100 days. According to a study by Germany’s justice ministry (paywall) in March, Twitter took down just 1% of hate-speech posts within 24 hours, while Facebook removed 39% on time. YouTube, owned by Google, removed 90% on time. In response to the poor results, Maas in March proposed a law that would impose a fine of up to €50 million for failing to remove hate speech quickly enough, and would require firms to put an executive in charge of dealing with complaints. That person could be personally liable for a fine of €5 million if their firm failed to meet mandatory standards.

The companies say they continue to work on hate-speech control measures. But the UK report, produced by the Home Affairs Committee, says the tech giants are “shamefully far” from dealing with illegal or dangerous content on their platforms. “Given their immense size, resources and global reach, it is completely irresponsible of them to fail to abide by the law, and to keep their users and others safe,” write the authors, 11 members of parliament. The committee also called for quarterly “performance reports” from the tech companies showing the volume of flagged posts and how they were dealt with.

The report also found that online hate speech has been rising in the UK. Google told the committee that flagged content on YouTube was up 25% year-on-year. Current hate-speech laws are unclear, and often don’t cover social media posts, the report said. The government told the committee that in most cases, social media posts didn’t cross the threshold into criminal behavior. But Carl Miller, a director at the think-tank Demos, told the committee that there was confusion over which laws applied to social media posts, and that the law was “incredibly unclear” on where criminality lay.

Critics of the government policy say the recommendations are kneejerk responses that could restrict freedom of speech online, while glossing over the technical challenges the platforms face. “How does [the committee] expect context to be reviewed for more than 1 billion pieces of content every day?” Alec Muffett, a director at the UK non-profit Open Rights Group, and a former security engineer at Facebook, wrote in response to the report.

It’s a puzzle that the world’s most valuable companies must confront as critical elections across Europe hinge on the interrelated issues of immigration and hate speech.
