New York University published a report Tuesday entitled Disinformation and the 2020 Election: How the Social Media Industry Should Prepare. (Image: screenshot of report cover)

New Report Warns of Deepfakes and Social Media Disinformation Campaigns That Aim to Undermine 2020 Election

The NYU expert who authored the report says social media companies "have to take responsibility for the way their sites are misused."

Jessica Corbett, staff writer

A report published Tuesday by New York University warns that fake videos and other misleading or false information could be deployed by domestic and foreign sources in efforts to influence the 2020 U.S. presidential election campaign, and details strategies to combat such disinformation.

"We urge the companies to prioritize false content related to democratic institutions, starting with elections."
—Paul M. Barrett, report author

The report—entitled Disinformation and the 2020 Election: How the Social Media Industry Should Prepare—predicts that for next year's election, so-called "deepfake" videos will be unleashed across the media landscape "to portray candidates saying and doing things they never said or did" and, as a result, "unwitting Americans could be manipulated into participating in real-world rallies and protests."

Deepfakes, as NPR reported Monday, are "computer-created artificial videos or other digital material in which images are combined to create new footage that depicts events that never actually happened." Manipulated videos like those of Democratic House Speaker Nancy Pelosi (Calif.) that spread virally online earlier this year—often called shallowfakes or cheapfakes—also pose a threat to democratic elections, the report says.

In terms of delivery of disinformation, the NYU report spotlights the messaging service WhatsApp and the video-sharing social media network Instagram—which are both owned by Facebook. A report commissioned by the Senate Intelligence Committee in the wake of the 2016 election accused Russia of "taking a page out of the U.S. voter suppression playbook" by using social media platforms including Facebook and Instagram to target African-American audiences to try to influence their opinions on the candidates in that race.

The NYU report predicts that governments such as Russia, China, and Iran may work to disseminate lies in attempts to sway public opinion on the next race for the White House, but "domestic disinformation will prove more prevalent than false content from foreign sources." Digital voter suppression, it warns, could "again be one of the main goals of partisan disinformation."

To combat disinformation from all sources, the NYU report offers nine recommendations for major social media companies:

  • Detect and remove deepfake videos: Realistic but fraudulent videos have the potential to undermine political candidates and exacerbate voter cynicism.
  • Remove provably false content in general: The platforms already remove hate speech, voter suppression, and other categories of content; we recommend that they add one more.
  • Hire a senior content overseer: Each company needs an executive with clout to supervise the process of guarding against disinformation.
  • Attend to Instagram's disinformation problem: The photo-posting platform needs the concerted attention of its parent, Facebook.
  • Limit the reach of WhatsApp: Users should be restricted to forwarding content to one chat group at a time.
  • Defend against for-profit disinformation: The companies must prepare for false content generated by hired-gun firms.
  • Back legislation regulating political ads and punishing voter suppression: Narrowly tailored bills pending in Congress could help curb some forms of disruption.
  • Improve industry-wide collaboration on disinformation: For example, when one platform takes down abusive accounts, others should do the same with affiliated accounts.
  • Teach social media literacy in a more direct, sustained way: Users have to take responsibility for recognizing false content, but they need more help to do it.

Paul M. Barrett, the report's author and deputy director of the NYU Stern Center for Business and Human Rights, told The Washington Post that social media companies "have to take responsibility for the way their sites are misused."

"We urge the companies to prioritize false content related to democratic institutions, starting with elections," he said. "And we suggest that they retain clearly marked copies of removed material in a publicly accessible, searchable archive, where false content can be studied by scholars and others, but not shared or retweeted."

While the removal of disinformation by social media giants is touted as a positive strategy by Barrett and others, such calls have sparked censorship concerns, especially as online platforms such as Facebook and YouTube have recently blocked content or shut down accounts that spread accurate information.

Michael Posner, director of NYU's Stern Center, said in a statement to The Hill that "taking steps to combat disinformation isn't just the right thing to do, it's in the social media companies' best interests as well." As he put it, "Fighting disinformation ultimately can help restore their damaged brand reputations and slow demands for governmental content regulation, which creates problems relating to free speech."

One example is what critics of the Trump administration have dubbed the 'Censor the Internet' executive order, which would give federal agencies certain powers to decide what internet material is acceptable. After a draft of that order leaked, as Common Dreams reported last month, "free speech and online advocacy groups raised alarm about the troubling and far-reaching implications of the Trump plan if it was put into effect by executive decree."

A Bloomberg report published Saturday revealed that "fake news and social media posts are such a threat to U.S. security that the Defense Department is launching a project to repel 'large-scale, automated disinformation attacks'" with custom software, sparking additional concerns about the potential consequences of U.S. government efforts to combat disinformation.

The NYU report does highlight the potential for legislation—particularly the Honest Ads Act, a bipartisan bill reintroduced earlier this year that aims to improve transparency around who is paying for political ads. However, Barrett told The Hill that he doesn't believe the bill "has much of a chance" of passing because of some Senate Republicans' positions on election security legislation.

"Congress would be making a huge contribution if there were hearings, particularly if there were bipartisan hearings... that educate people as to where we've been and what's likely to come," Barrett added. "We need more digital literacy, and Congress could use its position to provide that."

The NYU report adds to mounting concerns among tech experts, politicians, and voters about how disinformation could sway the 2020 election. John Feffer, director of Foreign Policy In Focus at the Institute for Policy Studies, wrote in June about the potential impact of deepfakes on next year's race:

Forget about October surprises. In this age of rapid dissemination of information, the most effective surprises happen in November, just before Election Day. In 2020, the election will take place on November 3. The video drops on November 2. The damage is done before damage control can even begin.

Feffer added that artificial intelligence (AI) systems which are "designed to root out such deepfake videos can't keep up with the evil geniuses that are employing other AI programs to produce them."

"It's an arms race," he wrote. "And the bad guys are winning."

Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.