Facebook Cracked Down on Myanmar Military Accounts and Content

20 individuals and organizations have been banned


Facebook’s efforts to keep its platform safe in Myanmar have accelerated to the point of banning several military officials.

The social network said in a Newsroom post that 20 individuals and organizations have been banned, including Senior General Min Aung Hlaing, commander-in-chief of the armed forces, and the military’s Myawady television network.

A total of 18 Facebook accounts, one Instagram account and 52 pages were included, totaling nearly 12 million followers, and the social network said it is preserving data and content from those accounts and pages.

Facebook provided more details on Min Aung Hlaing and Myawady: “International experts, most recently in a report by the U.N. Human Rights Council-authorized fact-finding mission on Myanmar, have found evidence that many of these individuals and organizations committed or enabled serious human rights abuses in the country. And we want to prevent them from using our service to further inflame ethnic and religious tensions. This has led us to remove six pages and six accounts from Facebook—and one account from Instagram—which are connected to these individuals and organizations. We have not found a presence on Facebook or Instagram for all 20 individuals and organizations we are banning.”

In addition, Facebook has removed 46 pages and 12 accounts “for engaging in coordinated inauthentic behavior,” saying that its investigation uncovered use of those pages and accounts to “covertly push the messages of the Myanmar military.”

Earlier this month, Facebook product manager Sara Su revealed in a Newsroom post that the social network formed a dedicated team this year, spanning its product, engineering and policy groups, to work on issues specific to the country. She added that Facebook identified about 52 percent of the content it removed for hate speech in Myanmar, up from 13 percent in the fourth quarter of 2017, due to “investments we’ve made both in detection technology and people.”

Facebook also shared examples of content that did and did not violate its community standards.