

European Commissioner for Internal Market Thierry Breton speaker at commission headquarters in Brussels, Belgium on June 15, 2023.
With the act now in effect for most platforms, the European Commission and member states "must resist any attempts by Big Tech companies to water down implementation," said one expert.
As the European Union's Digital Services Act expanded to cover nearly all online platforms in the bloc on Saturday, Amnesty International stressed the importance of robust enforcement.
"It's a historic day for tech accountability," said Alia Al Ghussain, researcher and adviser on technology and human rights at Amnesty Tech, in a statement. "Today must mark the end of the era of unregulated Big Tech, and for that to happen, the DSA must be robustly enforced to avoid it becoming a paper tiger."
"Today must mark the end of the era of unregulated Big Tech."
E.U. member states and the European Commission "are primarily responsible for the monitoring and enforcement of the additional obligations that apply to Big Tech companies under the DSA," Al Ghussain added. "They must resist any attempts by Big Tech companies to water down implementation and enforcement efforts, and insist on putting human rights at the forefront of this new digital landscape."
Some of the E.U.'s online rulebook took effect in August for 19 major platforms and search engines: Alibaba's AliExpress; Amazon; Bing; Booking.com; Apple's App Store; Google's Play, Maps, Search, Shopping, and YouTube; LinkedIn; Meta-owned Facebook and Instagram; Pinterest; Snapchat; TikTok; Wikipedia; X, formerly called Twitter; and Zalando.
The European Commission took its first formal action under the DSA in December, announcing an investigation into X—which is owned by billionaire Elon Musk—for "suspected breach of obligations to counter illegal content and disinformation, suspected breach of transparency obligations, and suspected deceptive design of user interface."
As of Saturday, the DSA applies to all online platforms, with some exceptions for firms that have fewer than 50 employees and an annual turnover below €10 million ($10.78 million)—though those companies must still designate a point of contact for authorities and users as well as have clear terms and conditions.
The DSA bans targeting minors with advertisements based on personal data and targeting all users with ads based on sensitive data such as religion or sexual preference. The act also requires platforms to provide users with: information about advertising they see; a tool to flag illegal content; explanations for content moderation decisions; and a way to challenge such decisions. Platforms are further required to publish a report about content moderation procedures at least once a year.
While companies that violate the DSA could be fined up to 6% of their global annual turnover or even banned in the E.U., imposing such penalties isn't the ultimate goal. According to Agence France-Presse:
Beyond the prospect of fines, Alexandre de Streel of the think tank Centre on Regulation in Europe said the law aimed ultimately to change the culture of digital firms.
"The DSA is a gradual system, everything is not going to change in one minute and not on February 17," he said. "The goal isn't to impose fines, it's that platforms change their practices."
Still, Thierry Breton, a former French tech CEO now serving as the European commissioner for the internal market, said in a statement that "we encourage all member states to make the most out of our new rulebook."
Like Amnesty's Al Ghussain, he stressed that "effective enforcement is key to protect our citizens from illegal content and to uphold their rights."
Earlier this week, Politico reported that "senior E.U. officials like Breton and Věra Jourová, commission vice president for values and transparency, have butted heads over how to sell the rulebook to both companies and the wider public." Internal battles and industry pushback aren't the only barriers to effectively implementing the DSA.
"At the national level, member countries are expected to nominate local regulators by February 17 to coordinate the pan-E.U. rules via a European Board for Digital Services," Politico noted. "That group will hold its first meeting in Brussels early next week. But as of mid-February, only a third of those agencies were in place, based on the commission's own data, although existing regulators in Brussels, Paris, and Dublin are already cooperating."
Campaigners are also acknowledging the shortcomings of the DSA. European Digital Rights on Saturday recirculated a November 2022 essay in which EDRi policy advisers Sebastian Becker Castellaro and Jan Penfrat argued that "the DSA is a positive step forward" but "no content moderation policy in the world will protect us from harmful online content as long as we do not address the dominant, yet incredibly damaging surveillance business model of most large tech firms."
Meanwhile, Al Ghussain said that "to mitigate the human rights risks posed by social media platforms, the European Commission must tackle the addictive and harmful design of these platforms, including changes to recommender systems so that they are no longer hardwired for engagement at all costs, nor based on user profiling by default."