Sep 20, 2017
Illustrating the outsize and controversial role it now plays in disseminating news and regulating information across the world, social media giant Facebook is under fire this week after human rights activists and journalists accused it of censoring posts about the crisis now unfolding in Myanmar, including accounts of "ethnic cleansing."
The company's decision to label a Rohingya armed group a "dangerous organization" came amid an international outcry against Myanmar's government and military, which has been accused of orchestrating an "ethnic cleansing" of the Rohingya, a Muslim minority that's often called the most persecuted minority group in the world. The Rohingya have been denied citizenship in Myanmar since 1982, and 146,000 have traveled to Bangladesh as refugees in recent weeks. Satellite imagery released by Human Rights Watch last week has shown the group's villages burned.
The Arakan Rohingya Salvation Army (ARSA) has coordinated attacks against police outposts in Myanmar--in defense, it says, of the 1.1 million Rohingya living in Myanmar's Rakhine state. After designating the group as dangerous this week, Facebook deleted posts by ARSA as well as by journalists who wrote about the crisis on the site, saying such posts violated its community standards prohibiting violent content.
"We are only removing graphic content when it is shared to celebrate the violence, versus raising awareness and condemning the action," a Facebook spokeswoman, Ruchika Budhraja, said in a statement to The Guardian.
But journalists and activists say their posts, which were aimed at drawing attention to the crisis, have been singled out by Facebook.
Rohingya activist Mohammed Rafique told The Guardian that he was temporarily banned from the site for posting "photos and videos of torture and killings in the Rohingya villages."
Jafar Arakane, a Rohingya refugee, also had his account suspended after posting news about the crisis on his YouTube channel's Facebook page.
Jameel Jaffer, director of the Knight First Amendment Institute at Columbia University, responded on Twitter to Facebook's statement about its process for deleting content.
"Facebook's response to claim that it is suppressing reports of ethnic cleansing in Myanmar is highly unsatisfying. https://t.co/84XjujnsLw" — Jameel Jaffer, September 19, 2017
Meanwhile, other groups involved in the crisis in Myanmar have not been challenged by Facebook in the same way--namely, Myanmar's military, which has an active verified page on the social media site, suggesting that its long history of persecuting the Rohingya has not earned it Facebook's "dangerous organization" label.
Last year, as Common Dreams reported, Facebook was called on by more than 70 organizations to clarify its position on censorship of content, especially as it concerned human rights.
Author J.M. Berger, who has studied extremism and social media, suggested that while Facebook is most likely not taking sides in the conflict between the Rohingya and their oppressors, its targeting of Rohingya groups and activists may point to a bias in Facebook's algorithms against Muslim groups.
"1) This is what happens when your content teams/algorithms are primary trained to detect and suspend Muslims." — J.M. Berger, September 18, 2017

"2) If algos, this highlights some obvious limitations. Jihadists have been talking about the Rohingya much longer than anyone else..." — J.M. Berger, September 18, 2017

"3) ...so if you trained an algo on keywords used by, say, ISIS or AQ, it might 'learn' that Rohingya is one of their keywords." — J.M. Berger, September 18, 2017
Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.