The Progressive NewsWire

A project of Common Dreams

For Immediate Release

Social Media Platforms Increase Transparency About Content Removal Requests, But Many Keep Users in the Dark When Their Speech Is Censored, EFF Report Shows

Who Has Your Back Spotlights Good, and Not So Good, Content Moderation Policies


San Francisco and Tunis, Tunisia--While social media platforms are increasingly giving users the opportunity to appeal decisions to censor their posts, very few platforms comprehensively commit to notifying users that their content has been removed in the first place, raising questions about their accountability and transparency, the Electronic Frontier Foundation (EFF) said today in a new report.

How users are supposed to challenge content removals that they've never been told about is among the key issues illuminated by EFF in the second installment of its Who Has Your Back: Censorship Edition report. The paper comes amid a wave of new government regulations and actions around the world meant to rid platforms of extremist content. But in response to calls to remove objectionable content, social media companies and platforms have all too often censored valuable speech.

EFF examined the content moderation policies of 16 platforms and app stores, including Facebook, Twitter, the Apple App Store, and Instagram. Only four companies--Facebook, Reddit, Apple, and GitHub--commit to notifying users when any content is censored and specifying the legal request or community guideline violation that led to the removal. While Twitter notifies users when tweets are removed, it carves out an exception for tweets related to "terrorism," a class of content that is difficult to accurately identify and can include counter-speech or documentation of war crimes. Notably, Facebook and GitHub were found to have more comprehensive notice policies than their peers.

"Providing an appeals process is great for users, but its utility is undermined by the fact that users can't count on companies to tell them when or why their content is taken down," said Gennie Gebhart, EFF associate director of research, who co-authored the report. "Notifying people when their content has been removed or censored is a challenge when your users number in the millions or billions, but social media platforms should be making investments to provide meaningful notice."

In the report, EFF awarded stars in six categories, including transparency reporting of government takedown requests, providing meaningful notice to users when content or accounts are removed, allowing users to appeal removal decisions, and public support of the Santa Clara Principles, a set of guidelines for speech moderation based on a human rights framework. The report was released today at the RightsCon summit on human rights in the digital age, held in Tunis, Tunisia.

Reddit leads the pack with six stars, followed by Apple's App Store and GitHub with five stars, and Medium, Google Play, and YouTube with four stars. Facebook, Reddit, Pinterest, and Snap each improved their scores since the inaugural censorship edition of Who Has Your Back in 2018. Nine companies meet our criteria for transparency reporting of takedown requests from governments, and 11 have appeals policies, but only one--Reddit--discloses the number of appeals it receives. Reddit also takes the extra step of disclosing the percentage of appeals decided in favor of or against the user.

Importantly, 12 companies publicly support the Santa Clara Principles, which outline minimum content moderation policy standards in three areas: transparency, notice, and appeals.

"Our goal in publishing Who Has Your Back is to inform users about how transparent social media companies are about content removal and encourage improved content moderation practices across the industry," said EFF Director of International Free Expression Jillian York. "People around the world rely heavily on social media platforms to communicate and share ideas, including activists, dissidents, journalists, and struggling communities. So it's important for tech companies to disclose the extent to which governments censor speech, and which governments are doing it."

For the report:
https://www.eff.org/wp/who-has-your-back-2019

For more on platform censorship:
https://www.eff.org/deeplinks/2019/05/

The Electronic Frontier Foundation is the leading nonprofit organization defending civil liberties in the digital world. Founded in 1990, EFF champions user privacy, free expression, and innovation through impact litigation, policy analysis, grassroots activism, and technology development. EFF's mission is to ensure that technology supports freedom, justice, and innovation for all people of the world.

(415) 436-9333