United Nations officials say that social media has had a “determining role” in anti-Rohingya Muslim violence in Myanmar, which the organization itself has called “ethnic cleansing.” And, “as far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media,” said Marzuki Darusman, chairman of the organization’s fact-finding mission on the country, according to Reuters.
In Myanmar, which is still effectively controlled by the military, Facebook is so prevalent that it essentially functions as the entire internet and serves as citizens’ main source of information (a local digital marketing agency puts the share of the population on Facebook at about 20%). That dominance has made it easy for ultra-nationalists to use the platform to stoke hatred against the Rohingya minority, who have been targeted by government forces, killed by the thousands, and driven out of the country.
Social media has “substantively contributed to the level of acrimony and dissension and conflict,” Darusman told reporters on March 12.
The anti-Rohingya propaganda on Facebook stemmed from several sources, most notably ultra-nationalist Buddhist monk Ashin Wirathu, but also government and military accounts, according to the Washington Post.
Robert Huish and Patrick Balazo, researchers at Canada’s Dalhousie University, explained on Quartz how the dissemination of hate in Myanmar functions on Facebook:
Anti-Rohingya content includes explicitly racist political cartoons, falsified images, and staged news reports. This content goes viral, normalizing hate speech and shaping public perception. Violence against Rohingya people is increasingly welcomed, and then celebrated online.
What’s more, Rohingya activists in the country told The Daily Beast last year that the social network had been censoring their posts.
“I’m afraid that Facebook has now turned into a beast, and not what it originally intended,” said UN investigator Yanghee Lee. The fact-finding mission is investigating whether the violence in Myanmar falls under genocide.
Facebook told Quartz that the company has taken several steps to address the situation in Myanmar. Beyond its global effort to bolster content moderation by hiring more reviewers, it says it routinely removes hate-speech content in the country, has removed Wirathu’s account (although only in late February), and has developed and promoted localized guidelines for using Facebook.
“We have invested significantly in technology and local language expertise to help us swiftly remove hate content and people who repeatedly violate our hate speech policies. We take this incredibly seriously and have worked with experts in Myanmar for several years to develop safety resources and counter-speech campaigns,” a spokesperson said. “Of course, there is always more we can do and we will continue to work with local experts to help keep our community safe.”