Amnesty International says Big Tech's consolidation of power "has profound implications for human rights, particularly the rights to privacy, nondiscrimination, and access to information."
One of the world's leading human rights groups, Amnesty International, is calling on governments worldwide to "break up with Big Tech" by reining in the growing influence of tech and social media giants.
A report published Thursday by Amnesty focuses on five tech companies: Alphabet (Google), Meta, Microsoft, Amazon, and Apple. Hannah Storey, an advocacy and policy adviser on technology and human rights at Amnesty, describes them as "digital landlords who determine the shape and form of our online interaction."
These five companies collectively have billions of active users, which the report says makes them akin to "utility providers."
"This concentration of power," the report says, "has profound implications for human rights, particularly the rights to privacy, nondiscrimination, and access to information."
The report emphasizes the "pervasive surveillance" by Google and Meta, which profit from "harvesting and monetizing vast quantities of our personal data."
"The more data they collect, the more dominant they become, and the harder it is for competitors to challenge their position," the report says. "The result is a digital ecosystem where users have little meaningful choice or control over how their data is used."
Meanwhile, Google's YouTube, as well as Facebook and Instagram—two Meta products—function using algorithms "optimized for engagement and profit," which emphasize content meant to provoke strong emotions and outrage from users.
"In an increasingly polarized context, the report says, "this can contribute to the rapid spread of discriminatory speech and even incitement to violence, which has had devastating consequences in several crisis and conflict-affected areas."
The report notes several areas around the globe where social media algorithms amplified ethnic hatred. It cites past research showing how Facebook's algorithm helped to "supercharge" dehumanizing rhetoric that fueled the ethnic cleansing of the Rohingya in Myanmar and the violence in Ethiopia's Tigray War.
More broadly, it says, the ubiquity of these tech companies in users' lives gives them outsized influence over access to information.
"Social media platforms shape what millions of people see online, often through opaque algorithms that prioritize engagement over accuracy or diversity," it says. "Documented cases of content removal, inconsistent moderation, and algorithmic bias highlight the dangers of allowing a handful of companies to act as gatekeepers of the digital public sphere."
Amnesty argues that international human rights law requires governments worldwide to intervene to protect their people from abuses by tech companies.
"States and competition authorities should use competition laws as part of their human rights toolbox," it says. "States should investigate and sanction anti-competitive behaviours that harm human rights, prevent regulatory capture, and prevent harmful monopolies from forming."
Amnesty also calls on these states to consider the possible human rights impacts of artificial intelligence, which it describes as the "next phase" of Big Tech's growing dominance, with Microsoft, Amazon, and Google alone controlling 60% of the global cloud computing market.
"Addressing this dominance is critical, not only as a matter of market fairness but as a pressing human rights issue," Storey said. "Breaking up these tech oligarchies will help create an online environment that is fair and just."
An advocate who has worked with the ICC said the order "actively undermines international justice efforts and obstructs the path to accountability for communities facing unthinkable horrors."
In a federal court in Maine on Friday, two human rights advocates argued that U.S. President Donald Trump's economic and travel sanctions against International Criminal Court Prosecutor Karim Khan violate their First Amendment rights, because the sanctions bar U.S. citizens from providing Khan with any services or material support for as long as they remain in place.
The lawsuit was filed by the ACLU on behalf of Matthew Smith, co-founder of the human rights group Fortify Rights, and international lawyer Akila Radhakrishnan.
Trump targeted Khan with the sanctions over his issuing of arrest warrants for Israeli Prime Minister Benjamin Netanyahu and former Israeli Defense Minister Yoav Gallant, whom Khan accused of war crimes and crimes against humanity in Gaza.
The plaintiffs argued that stopping U.S. citizens from working with Khan will bring their work investigating other atrocities to a halt.
Smith has provided the ICC with evidence of the forced deportation and genocide of the Rohingya people in Myanmar, but he said he has been "forced to stop helping the ICC investigate horrific crimes committed against the people of Myanmar, including mass murder, torture, and human trafficking."
"This executive order doesn't just disrupt our work—it actively undermines international justice efforts and obstructs the path to accountability for communities facing unthinkable horrors," Smith said in a statement.
"The Trump administration's sanctions may discourage countries, as well as individuals and corporations, from assisting the court, making it harder to bring alleged perpetrators from Israel and other countries to trial."
Charlie Hogle, staff attorney with the ACLU's National Security Project, said it was "unconstitutional" to block the plaintiffs and other humanitarian groups in the U.S. from "doing their human rights work" with the ICC.
Radhakrishnan, who focuses on gender-based violence in Afghanistan, said she was "bringing this suit to prevent my own government from punishing me for trying to hold the Taliban accountable for its systematic violence against women and girls from Afghanistan."
In March, Amnesty International warned that Trump's sanctions would "hinder justice for all victims for whom the [ICC] is a last resort," particularly those in Gaza and the occupied Palestinian territories.
The court "relies on its member states to cooperate in its investigations and prosecutions, including by arresting individuals subject to ICC arrest warrants," said Amnesty. "The Trump administration's sanctions may discourage countries, as well as individuals and corporations, from assisting the court, making it harder to bring alleged perpetrators from Israel and other countries to trial."
"Ultimately, the sanctions will harm all of the ICC's investigations, not just those opposed by the U.S. government," said the group. "They will negatively impact the interests of all victims who look to the court for justice in all the countries where it is conducting investigations, including those investigations the U.S. ostensibly supports—for example in Ukraine, Uganda, or Darfur."
"Rather than learning from its reckless contributions to mass violence in countries including Myanmar and Ethiopia, Meta is instead stripping away important protections that were aimed at preventing any recurrence of such harms."
An expert on technology and human rights and a survivor of the Rohingya genocide warned Monday that new policies adopted by social-media giant Meta, which owns Facebook and Instagram, could incite genocidal violence in the future.
On January 7, Meta CEO Mark Zuckerberg announced changes to Meta policies that were widely interpreted as a bid to gain approval from the incoming Trump administration. These included replacing fact-checkers with a community notes system, relocating content moderators from California to Texas, and lifting restrictions on certain kinds of criticism targeting groups such as immigrants, women, and transgender individuals.
Zuckerberg touted the changes as an anti-censorship campaign, saying the company was trying to "get back to our roots around free expression" and arguing that "the recent elections also feel like a cultural tipping point toward, once again, prioritizing speech."
"With Zuckerberg and other tech CEOs lining up (literally, in the case of the recent inauguration) behind the new administration's wide-ranging attacks on human rights, Meta shareholders need to step up and hold the company's leadership to account to prevent Meta from yet again becoming a conduit for mass violence, or even genocide."
However, Pat de Brún, head of Big Tech Accountability at Amnesty International, and Maung Sawyeddollah, the founder and executive director of the Rohingya Students' Network who himself fled violence by the Myanmar military in 2017, said the policy changes would make it even more likely that Facebook or Instagram posts would inflame violence against marginalized communities around the world. While Zuckerberg's announcement initially applied only to the U.S., the company has suggested it could make similar changes internationally as well.
"Rather than learning from its reckless contributions to mass violence in countries including Myanmar and Ethiopia, Meta is instead stripping away important protections that were aimed at preventing any recurrence of such harms," de Brún and Sawyeddollah wrote on the Amnesty International website. "In enacting these changes, Meta has effectively declared an open season for hate and harassment targeting its most vulnerable and at-risk people, including trans people, migrants, and refugees."
Past research has shown that Facebook's algorithms can promote hateful, false, or racially provocative content in an attempt to increase the amount of time users spend on the site and therefore the company's profits, sometimes with devastating consequences.
One example is what happened to the Rohingya, as de Brún and Sawyeddollah explained:
We have seen the horrific consequences of Meta's recklessness before. In 2017, Myanmar security forces undertook a brutal campaign of ethnic cleansing against Rohingya Muslims. A United Nations Independent Fact-Finding Commission concluded in 2018 that Myanmar had committed genocide. In the years leading up to these attacks, Facebook had become an echo chamber of virulent anti-Rohingya hatred. The mass dissemination of dehumanizing anti-Rohingya content poured fuel on the fire of long-standing discrimination and helped to create an enabling environment for mass violence. In the absence of appropriate safeguards, Facebook's toxic algorithms intensified a storm of hatred against the Rohingya, which contributed to these atrocities. According to a report by the United Nations, Facebook was instrumental in the radicalization of local populations and the incitement of violence against the Rohingya.
In late January, Sawyeddollah—with the support of Amnesty International, the Open Society Justice Initiative, and Victim Advocates International—filed a whistleblower complaint against Meta with the Securities and Exchange Commission (SEC) concerning Facebook's role in the Rohingya genocide.
The complaint argued that the company, then registered as Facebook, had known or at least "recklessly disregarded" since 2013 that its algorithm was encouraging the spread of anti-Rohingya hate speech and that its content moderation policies were not sufficient to address the issue. Despite this, the complaint alleges, the company misrepresented the situation to both the SEC and investors in multiple filings.
Now, Sawyeddollah and de Brún are concerned that history could repeat itself unless shareholders and lawmakers take action to counter the power of the tech companies.
"With Zuckerberg and other tech CEOs lining up (literally, in the case of the recent inauguration) behind the new administration's wide-ranging attacks on human rights, Meta shareholders need to step up and hold the company's leadership to account to prevent Meta from yet again becoming a conduit for mass violence, or even genocide," they wrote. "Similarly, legislators and lawmakers in the U.S. must ensure that the SEC retains its neutrality, properly investigate legitimate complaints—such as the one we recently filed, and ensure those who abuse human rights face justice."
The human rights experts aren't the only ones concerned about Meta's new direction. Even employees are sounding the alarm.
"I really think this is a precursor for genocide," one former employee told Platformer when the new policies were first announced. "We've seen it happen. Real people's lives are actually going to be endangered. I'm just devastated."