Sep 24, 2020
Following Facebook's announcement Wednesday that it will reject paid political ads claiming electoral victory before official results have been declared, critics on Thursday described the move as "yet another hollow gesture" that fails to address the larger problem of disinformation spread through "regular, non-ad posts" on the powerful social media company's profit-making platform.
Amid growing uncertainty over whether there will be a clear winner on Election Night, given an expected coronavirus-driven spike in mail-in voting and President Donald Trump's overt effort to sow chaos and doubt, Facebook CEO Mark Zuckerberg said: "It's important that we prepare for this possibility in advance and understand that there could be a period of intense claims and counter-claims as the final results are counted."
Although Wednesday's decision to ban paid political ads that declare victory for a candidate before election results are finalized marks a departure from the platform's past unwillingness to fact-check political ads, critics said the move is "once again far too little and far too late."
According to a new analysis reported Wednesday by CNN, Facebook has permitted a pro-Republican super PAC headed by former Trump administration officials to "target hundreds of misleading ads about Joe Biden and the U.S. Postal Service to swing-state voters ranging from Florida to Wisconsin in recent weeks."
The "apparent failure to enforce its own platform rules less than two months before Election Day" has resulted in the false ads, some of which remain active, being viewed more than ten million times, CNN noted.
The pro-democracy advocacy group Common Cause stated that "the pledge follows the social media giant's pattern of making incremental concessions when its existing policies cause controversies and public outcry."
Jesse Littlewood, vice president of campaigns at Common Cause, decried Facebook's "refusal to moderate public comment responsibly," which he said poses "a serious threat to our democracy."
Littlewood explained that in its response to the problem of voting and election-related disinformation, "Facebook has repeatedly focused on changing rules for advertising." While advertising is certainly "a vector for disinformation and can be micro-targeted to individuals," Littlewood added, "the threat... extends far beyond just advertising."
By "focusing on advertising," Littlewood said, Facebook "ignores and obscures the much larger problem of 'organic' content--the regular, non-ad posts--that spread disinformation."
"President Trump's personal and campaign accounts have repeatedly spread disinformation to massive audiences without any corrective action being taken by Facebook," he added.
While Facebook has previously announced a plan to address misleading posts about election results by attaching a label explaining that the contest is ongoing and undecided, that policy does not cover posts made between now and Election Day, nor those made earlier.
The platform has already contributed to the spread of dangerous disinformation among members of extremist groups, including the organizing of armed vigilante responses, which turned deadly, to protests against police violence, as Mother Jones correspondent Pema Levy reported Thursday.
"Facebook has pushed users into groups to drive up engagement, creating a radicalizing pipeline that leads to QAnon, militias, and white supremacy. These groups could become hubs for planning violence around the election https://t.co/Bo646pH2jj" --Pema Levy, September 24, 2020
Levy described the potential destructiveness unleashed by Facebook:
Closed groups now exist as a largely unpoliced, private ecosystem, where users can quickly spread content that will largely remain off the radar--including from journalists and Facebook's own moderators, who are not inside the spaces. Both private and public groups can become spaces for planning and coordinating violence, which Facebook's moderators and artificial intelligence tools have proven unable--and perhaps unwilling--to stop.
As Littlewood explained, Trump's campaign account on Wednesday posted a video featuring Donald Trump Jr. "spreading disinformation and a conspiracy theory that Trump's political opposition will interfere in the election, in order to recruit an election security 'army.'"
Facebook--which Littlewood characterized as "making money hand over fist" by becoming "a sewer of political disinformation"--has "declined to remove, disable, or otherwise stop" the video from spreading.
Despite the social media company's "disingenuous claims" to be deterring the circulation of inaccurate political information, argued Littlewood, its approach "opens the door for rampant disinformation to be spread."
"Until Facebook takes seriously the problem... coming from organic content--regardless of the source--it is making nothing more than token gestures," he said. "And the company knows it."
Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.
Kenny Stancil
Kenny Stancil is senior researcher at the Revolving Door Project and a former staff writer for Common Dreams.