A cardboard cutout of Mark Zuckerberg, CEO of Facebook, dressed up as the QAnon Shaman on Thursday, March 25, 2021.
With the CEOs of Twitter, Google, and Facebook set to testify Thursday on the role social media plays in promoting the kinds of misinformation and far-right extremism that sparked the deadly Capitol attack, anti-monopoly experts are urging members of Congress not to allow the executives to divert attention away from their fundamentally nefarious business model that thrives on the spread of dangerous lies.
"False or radicalizing content is not an unfortunate byproduct of the business model. It's core to these corporations' ad-based revenue models."
--Fight Corporate Monopolies
"The tech CEOs want to talk about their content policies and moderation efforts--because they know their core business models are indefensible and toxic," Morgan Harper, senior advisor at Fight Corporate Monopolies, said ahead of the House technology subcommittee hearing, which is scheduled to begin at 12:00 pm ET.
"If lawmakers are serious about fixing these problems," Harper added, "they must focus on structural power and resist the distractions. Anything less would be a massive failure."
Amid intensifying scrutiny from lawmakers and growing support for forceful antitrust and regulatory action, Facebook, Twitter, and Google in recent months have taken steps purportedly aimed at stemming the flow of misinformation about the coronavirus pandemic, Covid-19 vaccines, elections, and more.
In the aftermath of the January 6 insurrection--fueled by lies that circulated widely on social media--Facebook and Twitter banned former President Donald Trump for being the chief architect and amplifier of those lies.
But Fight Corporate Monopolies and other advocacy groups argue that such self-regulation--by design--does nothing to address the fact that Facebook, Google, and Twitter's profits depend to a significant degree on cultivating outrageous falsehoods and using invasive surveillance advertising to ensure they spread to receptive audiences.
"False or radicalizing content is not an unfortunate byproduct of the business model. It's core to these corporations' ad-based revenue models," said Fight Corporate Monopolies. "Facebook and Google's YouTube generate a substantial portion of their revenue by selling user data to advertisers--which means any social media obsession becomes a profit hub."
Tech CEOs, the group warned, "want to talk about modest regulatory reforms that would allow them to continue operating in largely the same ways they do today. We have seen this misdirection before, after YouTube and Facebook supercharged a conspiracy theory claiming George Floyd's death was faked to reach 1.3 million viewers."
Amnesty Tech's acting deputy director Joe Westby offered a similar critique, noting that "the business model of Big Tech firms like Google and Facebook depends on capturing people's attention to generate ad revenue--to that end, the algorithms that determine what we see on Facebook's newsfeed or Google's YouTube frequently amplify discrimination and inflammatory content."
"These companies appeal to our emotions of fear and anger to keep us staring at our screens," said Westby. "This can have a devastating effect at a population scale, fueling polarization, division, or serious human rights consequences."
Zephyr Teachout, a law professor at Fordham University, expressed hope that House panelists will ask Facebook CEO Mark Zuckerberg "how much money the company made off of QAnon," the far-right conspiracy theory whose adherents played a considerable role in the violent siege of the Capitol earlier this year.
Emma Ruby-Sachs, executive director of SumOfUs, said in a statement Thursday that Facebook, Google, and Twitter's "inability to deal with the violence, hate, and disinformation they promote on their platforms shows that these companies are failing to regulate themselves."
Ahead of the House subcommittee hearing, activists with SumOfUs gathered near the U.S. Capitol and displayed cutouts of tech executives dressed as insurrectionists to stress the role their platforms played in the violent January 6 attack.
"It's no shocker that Facebook failed to tell us about how its technology is being used to manipulate voters and spread harmful misinformation. How many times are we going to be fooled by these profit-hungry monopolies before Congress finally acts?" said Ruby-Sachs. "Letting Facebook decide how it should be regulated is like letting a criminal decide their own sentence."