Facebook outlined the steps it is taking to prevent its platform from being used as a tool to fuel the ethnic violence occurring in Myanmar.
Product manager Sara Su wrote in a Newsroom post that Facebook assembled a dedicated team earlier this year, drawing on its product, engineering and policy groups, to work on issues specific to the country.
The social network proactively identified about 52 percent of content that was removed for hate speech in Myanmar, up from 13 percent in the fourth quarter of 2017, due to “investments we’ve made both in detection technology and people,” Su wrote.
She added, “As recently as last week, we proactively identified posts that indicated a threat of credible violence in Myanmar. We removed the posts and flagged them to civil society groups to ensure that they were aware of potential violence.”
Another step Facebook is taking: It will no longer offer Zawgyi, a font encoding used only to display Burmese text, as an option for new users, and it is improving font converters for existing users.
Su explained that Unicode is the global industry standard for encoding and displaying fonts, and that it supports Burmese and other local Myanmar languages. More than 90 percent of phones in the country, however, use Zawgyi, so their owners cannot read websites, posts or Facebook Help Center instructions that are displayed in Unicode.
She added, “This will not affect people’s posts, but it will standardize how they see buttons, Help Center instructions and reporting tools in the Facebook application.”
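The Zawgyi-versus-Unicode gap is, at root, a text-encoding problem: The same Burmese word is stored as different code-point sequences under the two schemes, so software must first guess which encoding a string uses before it can convert or display it. As a rough illustration (not Facebook's actual detector, and far cruder than production tools such as Google's open-source myanmar-tools library), a Python sketch might flag text as likely Zawgyi when it uses code points or orderings that standard Unicode Burmese avoids:

```python
# Toy Zawgyi-vs-Unicode heuristic. The two rules below are simplified
# assumptions for illustration, not a production detector:
#  1. Zawgyi repurposes many code points in U+1060-U+1097 (reserved in
#     Unicode for Mon, Shan and Karen letters) as Burmese glyph variants,
#     so those code points are atypical in Burmese-language Unicode text.
#  2. Zawgyi stores the vowel sign E (U+1031) *before* the consonant it
#     modifies (visual order); standard Unicode stores it after.

ZAWGYI_ONLY = {chr(cp) for cp in range(0x1060, 0x1098)}
VOWEL_E = "\u1031"

def looks_like_zawgyi(text: str) -> bool:
    """Return True if text shows Zawgyi-style code-point usage."""
    if any(ch in ZAWGYI_ONLY for ch in text):
        return True
    # Vowel E at the start of the string or right after a space means it
    # precedes its consonant -- the Zawgyi storage order.
    prev = " "
    for ch in text:
        if ch == VOWEL_E and prev == " ":
            return True
        prev = ch
    return False

# Unicode-ordered "Myanmar" (consonants first, vowel signs after):
print(looks_like_zawgyi("\u1019\u103C\u1014\u103A\u1019\u102C"))  # False
# Zawgyi-style ordering: vowel E stored before its consonant:
print(looks_like_zawgyi("\u1031\u1019"))  # True
```

Real detectors weigh many more sequence patterns statistically, which is why Facebook's plan pairs detection with font converters rather than relying on any single rule.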
More than 60 Myanmar-language experts are part of the team that reviews content reported to Facebook as hate speech, and Su expects that number to reach 100 by year-end. Beyond human review, Su said Facebook engineers are developing artificial intelligence tools to help identify abusive posts, adding that experts from the social network’s policy and partnership teams are working with civil society groups and building digital literacy programs for people in Myanmar.
On the fake news front, the social network is working with a network of independent organizations to identify false news more quickly. Su added that Facebook intends to make this a global initiative, but it is initially focusing on “countries where false news has had life-or-death consequences,” including Myanmar, Sri Lanka, India, Cameroon and the Central African Republic.
Organizations and figures in Myanmar that have been banned from Facebook due to hate speech include Wirathu, Thuseitta, Parmaukkha, Ma Ba Tha and the Buddha Dhamma Prahita Foundation, and Su wrote that those entities can no longer have a presence on the social network, nor can people “support, praise or represent them.”
She concluded, “We have a responsibility to fight abuse on Facebook. This is especially true in countries like Myanmar, where many people are using the internet for the first time and social media can be used to spread hate and fuel tension on the ground. The ethnic violence in Myanmar is horrific, and we have been too slow to prevent misinformation and hate on Facebook. This is some of the most important work being done at Facebook. And we know we can’t do it alone—we need help from civil society, other technology companies, journalists, schools, government and, most important of all, members of our community. The weight of this work, and its impact on the people of Myanmar, is felt across the company.”