Facebook admitted Nov. 5 that its platform was “used to foment division and incite offline violence” in Myanmar. “We agree that we can and should do more,” Alex Warofka, product policy manager at Facebook, said in a statement.
Between May and September 2018, Facebook commissioned an independent assessment of the platform’s impact in Myanmar, including its role in facilitating human rights abuses. Business for Social Responsibility (BSR), a nonprofit organization with expertise in human rights policies, compiled the report.
BSR pointed out that Myanmar’s existing laws and regulations already inhibit freedom of expression, and that the widespread use of Facebook and social media for “character assassinations, rumor-spreading, and hate speech against minority individuals” enabled further abuses. The problem is compounded, BSR’s investigation found, because many in Myanmar treat Facebook and the internet as interchangeable. Earlier this year, after a United Nations report on the Rohingya crisis called out Facebook for hosting such content, the company removed 20 accounts, including those belonging to senior government officials.
The Rohingya crisis, described as a modern-day genocide and ethnic cleansing, has forced more than 700,000 Rohingya Muslims to flee the country in one of the largest forced migrations in recent history. While the 62-page report commissioned by Facebook is a major step by the company toward admitting complicity in the conflict, some have criticized its timing: the findings were released right before the US midterm elections, which are sure to dominate news cycles.
Facebook’s role in the Rohingya crisis has slipped under the radar before: A New York Times investigation in October detailed how the breadth of the Myanmar military’s online propaganda campaign against ethnic minorities had gone largely undetected. The latest BSR report outlines a series of recommendations to help Facebook combat this, as well as a framework to prevent similar incidents in the future.
These proposals advise Facebook to adopt a standalone human rights policy with formalized internal governance structures, and to share data that would help evaluate human rights violations. The report also emphasizes investing in more effective enforcement of the platform’s community standards: this includes hiring more Burmese-speaking staff who better understand local issues and using artificial intelligence to detect “violent and dehumanizing” content. Facebook previously had only two Burmese-speaking moderators; according to BSR’s report, it now has 60 Myanmar-language experts reviewing content and plans to add 40 more by the end of 2018.
Facebook has partially blamed the Zawgyi font encoding, which 90% of phones in Myanmar use instead of the global Unicode standard, for making content moderation difficult: text encoded in Zawgyi is hard for the company’s algorithms to parse.
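The practical difficulty is that Zawgyi reuses the same Unicode code points as standard Burmese text but assigns and orders them differently, so the encoding cannot be identified from the bytes alone. The sketch below is a minimal, hypothetical illustration of the kind of character-order heuristic detectors rely on; the function name and rules are illustrative assumptions, not Facebook’s actual tooling or the full logic of production Zawgyi converters.

```python
# Illustrative only: a rough guess at whether Burmese text was typed in Zawgyi
# rather than standard Unicode. Real classifiers are far more involved.

CONSONANTS = range(0x1000, 0x1022)  # Burmese consonants KA..A
MEDIALS = range(0x103B, 0x103F)     # medial signs YA, RA, WA, HA


def looks_like_zawgyi(text: str) -> bool:
    """Return True if the text shows telltale Zawgyi character patterns."""
    for i, ch in enumerate(text):
        cp = ord(ch)
        # Zawgyi repurposes code points that standard Unicode assigns to
        # Mon / Karen characters (so this check assumes Burmese-only input).
        if 0x1060 <= cp <= 0x1097:
            return True
        # Vowel sign E (U+1031): standard Unicode stores it after a consonant
        # or medial; Zawgyi stores visual order and places it first.
        if cp == 0x1031:
            prev = ord(text[i - 1]) if i > 0 else None
            if prev is None or (prev not in CONSONANTS and prev not in MEDIALS):
                return True
    return False


if __name__ == "__main__":
    print(looks_like_zawgyi("\u1031\u1000"))  # True: Zawgyi-style ordering
    print(looks_like_zawgyi("\u1000\u1031"))  # False: standard Unicode ordering
```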
BSR also highlights Myanmar’s upcoming 2020 elections as a turning point, citing one interviewee who said, “With two years advance notice, it is important that Facebook gets it right during election time.” The report adds that Facebook should focus on advocating for human rights reform across Southeast Asia as a whole, which might in turn benefit policies in Myanmar.