

In this photo illustration, the Threads logo by Meta is displayed on a smartphone with Twitter logo in the background.
A small-government intervention will clean up the public market and force Threads—and Meta—to build a better, safer sewing machine.
As a kid, I worked in a men’s store tailor shop on the East Side of Cleveland. It was chaos, watching master tailors cut, sew, and press tiny threads into modern fashion. My job was to clean the shop, oil the machines, and keep the steam presses hydrated. Thread was everywhere and constantly needed to be swept up, as each garment was crafted with care and purpose.
Whether Meta founder Mark Zuckerberg realized it or not, the name of his new text-based social media platform, Threads, is the perfect metaphor for the new platform we’ve all been craving. Will it be sewn into something beautiful or just another tangled mess that needs to be swept up?
Elon Musk’s decisions at the helm of Twitter and the longstanding issues surrounding the lack of controls against bullies and bots have disgusted millions of users. But is jumping ship to a new platform—owned by a flawed company that has not cleaned up its own issues—the way we want to engage?
Social media fashions have changed from when we first logged on over a decade ago. We are no longer excited by chaos, stunts, or gimmicks, or by learning basic HTML to customize our backgrounds on MySpace. Many of us just want an uncluttered, simple social platform that’s bully- and bot-free, and isn’t trying to sell us stuff we don’t want or need. Adam Mosseri, the head of Instagram, knows this, and was quoted in The New York Times saying he wants “Threads to be a ‘friendly place’ for public conversation.”
But is that even possible, given that Threads has seemingly already fallen short on protections? After my first day on Threads, I already faced issues that have plagued Twitter—a blatantly similar type of platform—for years. I had fake profiles and bots already following my account.
If Threads wants to succeed, it needs a bobbin to keep it running smoothly. Think of it as adding some simple guardrails to keep the threads from jamming the machine. Without this basic intervention, we already know the downward spiral that’s coming next.
We have watched social networks, including Meta, fight to keep and expand archaic protections granted by the 1996 Communications Decency Act. These protections were created to allow companies like AOL and Prodigy to be treated as blind infrastructure, like a telephone line, and never be held liable for any communications carried on their lines.
These laws were created before there were modern-day social networks, let alone billions of dollars in advertising revenue being moved through them.
Unfortunately, as each of these platforms competes to become the largest network in the free market, without any intervention or protections, they will create more of the same bot-driven cesspools, spreading misinformation and disinformation and promoting false advertising. There is no real incentive for them to do anything different in the United States. Threads is not yet in the European Union, since the E.U. has stricter privacy laws. It also has yet to implement advertising, but that’s just a matter of time.
Now is the time to evolve the Communications Decency Act so that the next generation of social networks is sewn into a more wearable garment. This is not un-American. Think back to that famous Thomas Jefferson quote: “We might as well require a man to wear still the coat which fitted him when a boy as a civilized society to remain ever under the regimen of their barbarous ancestors.” Let’s follow this lead and advance our social platforms by evolving Section 230 of the 1996 Communications Decency Act and forcing these powerful companies to take accountability for their actions.
Historically, Twitter only took performative actions to resolve or remove bots and fake accounts before its executives testified before Congress or before a major election. The company was well known for putting out self-congratulatory press releases on how it clamped down and removed tons of bots and bad actors—but let’s be honest, it never implemented long-term fixes to these known problems.
A simple change in liability—the bobbin—will ensure social networks run more smoothly by forcing them to focus on their consumers. This change will make these companies spend resources on security measures, monitoring technology, and even hiring staff to review advertising for accuracy, just like every other media outlet in America.
In other words, a small-government intervention will clean up the public market and force Threads—and Meta—to build a better, safer sewing machine. One that does not allow its users to be threatened by hate speech or acts of violence without real consequences.
It’s time for Congress to take out their brooms, evolve the Communications Decency Act, and help clean up these threads.