Mar 30, 2022
Despite pledging to fight climate disinformation, Facebook's algorithm continues to amplify deceptive and misleading content--pushing users who express doubts about anthropogenic climate change toward outright denial, according to a new report released this week by Global Witness.
"Our investigation shows how worryingly easy it is for its users to be led down a dangerous path that flies in the face of both science and reality."
As the United Kingdom-based human rights group pointed out, Facebook's profit-maximizing business model seeks to increase the amount of time people spend "scrolling, liking, and sharing content" because the longer a person stays on the platform, "the more ads they can show that user, thus generating more revenue."
"But it's not just the ads that are tailored to your profile," wrote Global Witness. "Based on your likes, characteristics, behaviors, and demographics, Facebook's algorithm will serve you content that it thinks you will enjoy, (to keep you on the platform) and thus the circle continues."
Facebook announced last year that it would begin attaching "informational labels" to some posts about climate change, directing users to the platform's new "Climate Science Center," which it says connects people "with science-based news, approachable information, and actionable resources" from leading environmental organizations.
To test the efficacy of Facebook's stated commitment to curbing climate disinformation, researchers created two users--"Jane," a so-called climate skeptic, and "John," who accepts the consensus view, held by over 99.9% of the world's scientists, that human activity is propelling contemporary climate change--and observed the quality of information they saw regarding the life-threatening reality of the fossil fuel-driven planetary emergency.
"When we simulated the experience of a climate skeptic user on the platform, within a few clicks Facebook's algorithm recommended content that denied the existence of man-made climate warming and attacked measures aimed at mitigating the climate crisis," wrote Global Witness. "Much of this content deployed culture war tactics to polarize debate around climate change and demonize environmental movements."
After creating a brand-new Facebook account for Jane, researchers had the climate skeptic's profile "like" the Global Warming Policy Forum, which recently changed its name to Net Zero Watch, a page that belongs to a U.K.-based science-denying organization with ties to Steve Baker, a right-wing Tory lawmaker.
Right away, Facebook's algorithm presented Jane with a slew of recommended pages it thought she would enjoy. Global Witness looked at the first three climate-related pages, and according to the group--which graded the first nine posts on each page using a classification system it developed to categorize different types of climate content--all but one were "dedicated to climate disinformation."
After Jane "liked" Facebook's first recommendation, a page called Climate Depot--a U.S.-based outlet run by Marc Morano, the communications director of a conservative nonprofit that calls human-caused climate change a "myth"--she was given another "menu of recommendations that Facebook's algorithm thought, based on her existing likes, she would find engaging."
"Of the 27 pieces of content that we graded across the first three recommended climate pages, 100% of it was climate disinformation that either fell into 'distract and delay' or denial categories," wrote researchers. They added:
Eager to see if this was just a one-off, we replicated this test with the same account to see what kinds of climate content would be amplified to Jane, whose online behavior thus far demonstrated an interest in climate disinformation. Using two different anti-climate pages as our starting points, we followed the pages recommended to Jane to understand at what point, if at all, Facebook's algorithm would intercept the bad information with reputable climate science information.
In total we traced Jane's trajectory from three starter pages leading to 18 recommended pages and graded 189 pieces of content. The results?
Of the 18 pages recommended to Jane, only one did not contain any climate disinformation. Twelve of them only included climate disinformation. Of the content we analyzed, only 22% of climate disinformation posts possessed a Climate Science Center flag. As a sub-category, only 34% of climate denial content possessed a flag.
Moreover, while Jane was sporadically and infrequently being referred to the Center, she was being actively encouraged to follow and like pages that almost exclusively espoused climate disinformation.
In response to Global Witness' investigation, Facebook said that "for several months after we announced the initial experiment of informational labels in the U.K., we did not completely roll out our labeling program."
Echoing previous studies of Facebook's massive failure to flag and reduce the spread of climate lies after promising to do so, Global Witness found that the social media giant continues to recommend pages with content promoting the following views:
- The climate crisis is a hoax;
- Rising temperatures are part of natural cycles;
- Environmentalists are alarmists;
- Climate scientists are biased;
- Warming models are inaccurate; and
- Mitigation solutions won't work or are otherwise bad for society.
"This disinformation has consequences," wrote Global Witness. "Research shows that climate disinformation is a primary contributor to public polarization over the climate crisis and that it shapes public attitudes toward climate science. Individuals who are exposed to this kind of disinformation are less likely to support mitigation policies, hindering the ability of policymakers to take meaningful climate action."
In sharp contrast to Jane, experimental Facebook user John "liked" the Intergovernmental Panel on Climate Change's page, after which "every page recommended to John by the algorithm encouraged him to engage with more reliable climate science content," researchers noted.
"Self-regulation is not working."
"The split-screen realities between Jane and John's experience on the very same platform shows the radicalizing effect of Big Tech," they continued. "Facebook's algorithm is ultimately ensuring that the people most in need of good information are the ones least likely to get it."
Jane, an on-the-fence climate skeptic, they added, "was directed to worse information, so that what began on a page full of distract and delay narratives, ended on pages espousing outright climate denial and conspiracy."
Some of the content Jane saw described the United Nations as an "authoritarian regime" that has less credibility than "Bugs Bunny" and accused the "green movement" of "enslaving humanity."
"Facebook has repeatedly said it wants to combat climate disinformation on its platform, but our investigation shows how worryingly easy it is for its users to be led down a dangerous path that flies in the face of both science and reality," Mai Rosner, Digital Threats to Democracy campaigner at Global Witness, said in a statement. "Facebook is not just a neutral online space where climate disinformation exists; it is quite literally putting such views in front of users' eyes."
"The destruction of our planet is not up for debate, but the amplification of climate disinformation risks getting us stuck in inaction and division that delays the urgent policies we need to combat the climate crisis," said Rosner. "People whose homes have been destroyed by wildfires and floods need no reminding of just how real the climate crisis is."
According to Rosner, "The climate crisis is increasingly becoming the new culture war, with many of the same individuals who for years have sought to stoke division and polarize opinion now viewing climate as the latest front in their efforts."
Researchers warned:
While it may only be a small subset of users who are engaging with climate change conspiracy and denial, recent events point to the fact that even obscure narratives which begin on the corners of social media have the ability to spill over into our physical realities and shift political discourse.
The QAnon movement, which believes that a global liberal elite runs child sex rings that only Donald Trump can stop, first came into existence on the fringes of the internet in 2017. Today, in 2022, over 40 candidates who have expressed some public support for the QAnon conspiracy are running for U.S. national office. Supporters of the conspiracy also played an important role in the January 6 Capitol insurrection, demonstrating the ability of online narratives to become real-world violence.
They also referred to Facebook whistleblower Frances Haugen, who explained during her testimony to U.K. lawmakers that Facebook's algorithm not only "amplifies divisive, polarizing, extreme content" but that this kind of content "gets hyper-concentrated in 5% of the population."
Facebook's stated desire to connect people with reliable information about the climate crisis and its goal of maximizing engagement "are entirely at odds," Global Witness wrote, stressing that the corporation's executives have "shown that when weighing the health of users against their bottom line they will choose to profit time and again. Self-regulation is not working."
"Governments must step in and legislate against the power of Big Tech to shape our realities in dangerous and divisive ways that threaten to derail progress toward tackling the greatest challenge our planet collectively faces," researchers continued, calling for "algorithmic transparency requirements."
"The European Union must ensure that once implemented, the Digital Services Act requires platform audits and risk assessments that are comprehensive and routinely include climate disinformation," they added. "Governments elsewhere--notably the U.S.--must follow the E.U.'s lead and legislate to regulate Big Tech companies."
Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.
Kenny Stancil
Kenny Stancil is senior researcher at the Revolving Door Project and a former staff writer for Common Dreams.