
Alex Jones, a right-wing radio host and conspiracy theorist, in 2017. (Jay Janner/Austin American-Statesman via AP)

Alex Jones Is Far From the Only Person Tech Companies Are Silencing

Facebook, Twitter, Apple and other companies routinely silence voices in marginalized communities around the world that struggle to be heard in the first place, replicating their offline repression

David Greene, The Washington Post

We're finally having a public debate about the big Internet platforms policing content and suspending accounts. But it's a serious mistake to frame the debate about content moderation around right-wing conspiracy theorist Alex Jones's Infowars and not around the thousands of other moderation decisions that have been made by such online giants as Apple, Facebook, Google-owned YouTube and Spotify.

Internet companies have removed millions of posts and images over the past decade and suspended hundreds, perhaps thousands, of user accounts. These silenced voices span the political spectrum and the globe: Moroccan atheists, women discussing online harassment, ads featuring crucifixes, black and Muslim activists reposting racist messages they received, trans models, drag performers, indigenous women, childbirth images, photos of breast-feeding. Platforms have taken down documentation of war crimes in Syria, Myanmar and Kashmir, arrests in North Dakota and police brutality across the United States.

We should be extremely careful before rushing to embrace an Internet that is moderated by a few private companies by default, one where the platforms that control so much public discourse routinely remove posts and deactivate accounts because of objections to the content. Once systems like content moderation become the norm, those in power inevitably exploit them. Time and time again, platforms have capitulated to censorship demands from authoritarian regimes, and powerful actors have manipulated flagging procedures to effectively censor their political opponents. Given this practical reality, and the sad history of political censorship in the United States, let's not cheer one decision that we might agree with.

Moderation systems operate in secret with no public accountability

Even beyond content moderation's vulnerability to censorship, the moderating process itself, whether undertaken by humans or, increasingly, by software using machine-learning algorithms, is extremely difficult. Awful mistakes are commonplace, and rules are applied unevenly. Company executives regularly reshape their rules in response to governmental and other pressure, and they do so without significant input from the public. Ambiguous "community standards" result in the removal of some content deemed to have violated the rules, while content that seems equally offensive remains online.

Moderation systems operate in secret with no public accountability. We're pretty sure we know why Infowars was suspended, but in many cases the decisions are far more opaque. And unlike Jones, few of the others have powerful multimedia operations with an audience in the millions that will simply move to using an app when other venues become unavailable, as is happening with Infowars. Facebook, Twitter, Apple and other companies routinely silence voices in marginalized communities around the world that struggle to be heard in the first place, replicating their offline repression.

The palpable public momentum for private companies to moderate the content on their sites more actively is worrisome, even though I share the concerns motivating it: disinformation, extremism, and harassment.

We must also not lose sight of the fact that while the dominant social media platforms are largely based in the United States, the majority of their users are based outside the country, and the policies the platforms adopt here have outsize influence elsewhere. If content moderation is here to stay, existing human-rights standards provide a framework for policies that companies could and should voluntarily implement to protect their users.

There have been several worthy efforts to articulate a human-rights framing for content moderation. One framework, which the organization where I work, the Electronic Frontier Foundation, played a part in formulating, is found in the Santa Clara Principles. These principles advance three key goals: numbers (companies should publish the number of posts removed and accounts suspended); notice (companies should provide notice and an explanation to each user whose content is removed); and appeal (companies should provide a meaningful opportunity for timely appeal of any content removal or account suspension).

David Kaye, the special rapporteur for the United Nations on the promotion and protection of the right to free expression, recommended in a recent report that private companies — and governments — should as a routine matter consider the impact that content moderation policies have on human rights. He recommends that governments not pressure private companies to implement policies that interfere with people's right to free expression online.

The power that these platforms have over the online public sphere should worry all of us, no matter whether we agree or disagree with a given content decision. A decision by any one of them has a huge effect. Even worse, if other companies move in lock step, a speaker may effectively be forced offline.

Transparency in these companies' content-moderation decisions is essential. We must demand that they apply their rules consistently and provide clear, accessible avenues for meaningful appeal.


© 2021 Washington Post
David Greene

David Greene is a senior staff attorney for the Electronic Frontier Foundation.
