Facebook says it’s “not a media company,” but it decides whether Palestinian editors can publish

Exercising editorial judgment over the masses
Image: Reuters/Dado Ruvic

Facebook is once again deciding how the media should do its job, even though CEO Mark Zuckerberg has insisted it’s “not a media company.” This time, two Palestinian news organizations say one of their Facebook pages and the personal accounts of seven editors and executives were suspended, Al Jazeera reported. The suspensions prevented the journalists from accessing their organizations’ Facebook pages, which collectively have 11 million “likes.”

The news organizations allege that the suspensions were the result of an agreement Facebook struck with Israel earlier this month to monitor incitement to violence on the platform. Facebook says the suspensions were a mistake in the way it handles accounts flagged for review.

Facebook reinstated the accounts over the weekend (Sept. 24) and apologized. “The pages were removed in error and restored as soon as we were able to investigate. Our team processes millions of reports each week, and we sometimes get things wrong. We’re very sorry about this mistake,” a Facebook spokesperson said in a statement.

Facebook asks its users to report content and accounts that may violate its “community standards”—a set of rules written by the platform that dictate what’s allowed on it. A team of reviewers then decides whether to take action, which might include issuing a warning, preventing an account from posting, or banning it outright, according to Facebook. A reviewer made a mistake in the case of the Palestinian news publishers, Facebook said.

The system has caused outrage before, as it did recently in Norway, where Facebook removed posts featuring a Pulitzer Prize-winning photograph, the “napalm girl” of the Vietnam War, because it depicted a child’s nude form. Facebook reversed its position after Norway’s prime minister got involved.

Facebook is also at the center of a hate-speech furor in Germany, where, under a deal struck last December with the German government (along with Twitter and Google), it is supposed to remove xenophobic and racist posts within 24 hours. An analysis we conducted in March found that this rarely happened.

Social-media platforms are increasingly occupying the role of “the internet’s editors,” as technology commentator Dan Gillmor has called Facebook. Twitter, for instance, is now weighing whether to allow a Turkish journalist’s tweets to remain visible in the country after receiving a local court order to block them, according to Motherboard. It, too, at one point called itself “not a media company.” The opaque and sometimes arbitrary ways in which platforms decide who gets to say what to an audience of millions are, ironically, perhaps the clearest way to see their editorial judgment at work.