
"After devoting nearly two decades to growth at any cost," writes Karr, the Facebook "network has become far too big NOT to fail—even when you consider the resources Facebook is now throwing at filtering and flagging objectionable content. And those failures are having dangerous ramifications in the real world." (Photo: flickr/GostGo/cc)


Why Facebook Filtering Will Ultimately Fail

We as a society need to do more than hope that Facebook can fix itself.

Timothy Karr

In its content-moderation report released this week, Facebook revealed that it had removed a whopping 3.2 billion fake accounts from March through September 2019.

That’s a lot of disinformation, wrote tech reporter Aditya Srivastava: “To put it in perspective, the number is [nearly] half of living human beings on planet earth.”

Facebook also claims to have removed or labeled 54 million pieces of content flagged as too violent and graphic, 18.5 million items deemed sexual exploitation, 11.4 million posts that broke its anti-hate-speech rules, and 5.7 million that violated its bullying and harassment policies.

We need to consider these large-seeming numbers in the context of the social-media giant’s astronomical growth. Facebook, Inc. now hosts so much content that it’s hard to imagine any filtering apparatus that’s capable of sorting it all out.

"Our challenge is to look beyond fixing Facebook to building a new model for social media that doesn’t violate human and civil rights, or undermine democratic norms."

Facebook CEO Mark Zuckerberg admitted to Congress last month that more than 100 billion pieces of content are shared via Facebook entities (including Instagram and WhatsApp) each day.

Bear with me because the math gets interesting.

A hundred billion pieces of content a day amounts to roughly 18 trillion (with a “t”) pieces of content over the six-month period covered in Facebook’s moderation report.

Or to put it in Srivastava's terms, that’s comparable to having every living individual on Earth posting content to Facebook platforms 13 times each and every day.
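The arithmetic behind those two figures can be checked with a quick back-of-the-envelope calculation. The reporting window of roughly 183 days and the world population of roughly 7.7 billion are assumptions for illustration; the 100-billion-per-day figure comes from Zuckerberg's testimony cited above.

```python
# Sanity-check the article's figures.
# Assumptions: ~183 days in the March-September window,
# world population ~7.7 billion in 2019.
daily_posts = 100e9          # pieces of content shared per day (per Zuckerberg)
days = 183                   # roughly six months
world_population = 7.7e9     # approximate 2019 figure

total_posts = daily_posts * days                    # ~1.83e13, i.e. ~18 trillion
per_person_per_day = daily_posts / world_population # ~13 posts per person per day

print(f"{total_posts:.2e}")
print(f"{per_person_per_day:.1f}")
```

Both of the article's numbers hold up: about 18 trillion pieces of content over six months, or about 13 daily posts for every living person.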

That’s where we are today.

Content armies

To filter this tsunami of images and text in search of suspicious posts, Facebook has deployed an army of more than 30,000 content moderators, and is aided in this effort by artificial intelligence designed to flag content that may violate its rules.

It reminds me of my experience working as a journalist in Hanoi in the pre-internet early 1990s.

My minder from Vietnam’s Ministry of Culture and Information told me that the government employed more than 10,000 people skilled in foreign languages to monitor all outgoing and incoming phone calls and faxes, which is how we communicated back then.

Even in an era still dominated by analog communications, the ministry’s many listeners struggled to keep up.

And while I received the occasional knock on the door from a ministry apparatchik concerned about a phone conversation I’d had or an article I’d published, the vast majority of my work—including stories the government would have frowned on—went through to my editors in Hong Kong and Tokyo undetected.

Fast forward to 2019.

Just last month, a fairly senior Facebook employee told me that the company is shooting for a 99-percent global success rate in flagging content that violates its rules regarding hateful or racist activity. 

But it still has a way to go. And even a 99-percent filter may not be good enough.
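To see why even a 99-percent success rate may not be good enough, consider the remaining 1 percent at Facebook's scale. This is an illustrative calculation using the 100-billion-per-day figure Zuckerberg gave Congress; the assumption that every piece of missed content matters equally is, of course, a simplification.

```python
# The flip side of a 99-percent filter at Facebook's scale.
daily_posts = 100e9   # pieces of content shared per day (per Zuckerberg)
success_rate = 0.99   # the company's stated global target

# Content that would still slip past the filter each day.
missed_per_day = daily_posts * (1 - success_rate)
print(f"{missed_per_day:,.0f}")
```

Even at the company's target rate, roughly a billion pieces of content a day would go unflagged.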

Too big NOT to fail

In its most recent report, the social-media giant claimed to have reached an 80-percent “proactive rate”—a metric by which the company measures its ability to flag hateful content before its community of users has done so.

What does this mean? For one thing, that Facebook seems to be making a serious effort to combat the hateful activity and disinformation coursing across its network.

But it also means that despite these efforts, the flood of content remains far too overwhelming to manage; that even at a very high proactive rate, millions of posts that violate the company’s rules would still slip through.

And its average success rate is likely far lower in countries where Facebook lacks the language and cultural expertise (or, some would argue, the economic interest) to effectively flag harmful content.

A report released by Avaaz at the end of October seems to confirm this. Avaaz found that Facebook removed only 96 of 213 posts it had flagged for attacking India’s Bengali Muslim community using similar online tactics as those used against Rohingya minorities in Myanmar.

One post came from an elected official labeling “Bangladeshi Muslims” as those who rape “our mothers and sisters.” Others included calls to “poison” their daughters and to legalize female feticide.

So, again, what does this mean?

It means that even when operating at optimal levels, Facebook’s content-moderation schemes will still miss posts that incite people to inflict real offline violence on some of the most vulnerable among us. And its algorithms will still promote content that provokes strongly partisan and divisive reactions among users.

It also means that we as a society need to do more than hope that Facebook can fix itself. After devoting nearly two decades to growth at any cost, the company’s network has become far too big NOT to fail—even when you consider the resources Facebook is now throwing at filtering and flagging objectionable content. And those failures are having dangerous ramifications in the real world. 

Our challenge is to look beyond fixing Facebook to building a new model for social media that doesn’t violate human and civil rights, or undermine democratic norms.

There’s only so much Facebook can do to fix the problem at its core. The solution lies elsewhere, with those of us willing to create a better platform from scratch.


Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.
Timothy Karr

Timothy Karr is the senior director of strategy for Free Press Action Fund, the advocacy organization that fights for everyone’s rights to connect and communicate. Follow him on Twitter: @TimKarr.


Common Dreams, Inc. Founded 1997. Registered 501(c)(3) Non-Profit | Privacy Policy