Illustrating the outsize and controversial role it now plays in disseminating news and regulating information across the world, social media giant Facebook is under fire this week after human rights activists and journalists accused it of censoring posts about the crisis now unfolding in Myanmar, including posts documenting charges of "ethnic cleansing."
The company's decision to label a Rohingya armed group a "dangerous organization" came amid an international outcry against Myanmar's government and military, which have been accused of orchestrating an "ethnic cleansing" of the Rohingya, a Muslim minority often called the most persecuted group in the world. The Rohingya have been denied citizenship in Myanmar since 1982, and 146,000 have fled to Bangladesh as refugees in recent weeks. Satellite imagery released by Human Rights Watch last week showed Rohingya villages burned.
The Arakan Rohingya Salvation Army (ARSA) has coordinated attacks against police outposts in Myanmar—in defense, it says, of the 1.1 million Rohingya living in Myanmar's Rakhine state. This week, after Facebook designated the group as dangerous, the company deleted posts by ARSA as well as by journalists who wrote about the crisis on the site, saying such posts violated its community standards prohibiting violent content.
"We are only removing graphic content when it is shared to celebrate the violence, versus raising awareness and condemning the action," a Facebook spokeswoman, Ruchika Budhraja, said in a statement to The Guardian.
But journalists and activists say their posts, which were aimed at drawing attention to the crisis, have been singled out by Facebook.
Rohingya activist Mohammed Rafique told The Guardian that he was temporarily banned from the site for posting "photos and videos of torture and killings in the Rohingya villages."
Jafar Arakane, a Rohingya refugee, also had his account suspended after posting news about the crisis on his YouTube channel's Facebook page.
Jameel Jaffer, director of the Knight First Amendment Institute at Columbia University, responded on Twitter to Facebook's statement about its process for deleting content.
Meanwhile, other groups involved in the crisis in Myanmar have not been challenged by Facebook in the same way—namely, Myanmar's military, which has an active verified page on the social media site, suggesting that its long history of persecuting the Rohingya has not earned it Facebook's "dangerous organization" label.
Last year, as Common Dreams reported, Facebook was called on by more than 70 organizations to clarify its position on censorship of content, especially as it concerned human rights.
Author J.M. Berger, who has studied extremism and social media, suggested that while Facebook is most likely not taking sides in the conflict between the Rohingya and their oppressors, its targeting of Rohingya groups and activists may point to a bias in Facebook's algorithms against Muslim groups.
1) This is what happens when your content teams/algorithms are primary trained to detect and suspend Muslims. — J.M. Berger (@intelwire), September 18, 2017

2) If algos, this highlights some obvious limitations. Jihadists have been talking about the Rohingya much longer than anyone else... — J.M. Berger (@intelwire), September 18, 2017

3) ...so if you trained an algo on keywords used by, say, ISIS or AQ, it might "learn" that Rohingya is one of their keywords. — J.M. Berger (@intelwire), September 18, 2017
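The failure mode Berger describes can be illustrated with a toy sketch. This is not Facebook's actual system, and every document and word below is invented for illustration: a naive keyword-frequency model trained on a corpus where extremist propaganda happens to mention "rohingya" will flag an activist's awareness-raising post simply because it shares that one word.

```python
from collections import Counter

# Invented toy corpora: the "extremist" training texts mention "rohingya"
# because, as Berger notes, jihadist propaganda discussed the crisis early.
extremist_docs = [
    "jihad attack rohingya brothers fight",
    "fight jihad rohingya attack enemy",
]
benign_docs = [
    "weather today sunny picnic park",
    "football match score goal team",
]

def keyword_scores(docs):
    """Relative frequency of each word in a corpus."""
    counts = Counter(w for d in docs for w in d.split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

ext = keyword_scores(extremist_docs)
ben = keyword_scores(benign_docs)

def flag_score(post):
    """Sum of (extremist freq - benign freq) over the post's words.
    A positive score means the post looks 'extremist' to this model."""
    return sum(ext.get(w, 0) - ben.get(w, 0) for w in post.split())

# An activist post raising awareness is flagged: "rohingya" appears
# only in the extremist training corpus, so it carries positive weight.
activist_post = "rohingya villages burned raising awareness"
print(flag_score(activist_post) > 0)  # True
```

The point of the sketch is that the model has no notion of intent: it cannot distinguish celebrating violence from condemning it, only correlate words with training labels, which is one plausible mechanism behind the takedowns described above.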