(Photo: Flickr / cc / mkhmarketing)
New details about how Facebook allowed academic researchers to conduct a secret experiment on nearly 700,000 of its users, testing whether their emotions could be digitally manipulated, have spurred widespread condemnation and fresh fears about the power of such systems when turned against the millions of people who use them every day.
The experiment in question, which sought to document evidence of a "massive-scale emotional contagion through social networks," was authorized by Facebook in 2012 and conducted with the assistance of outside researchers at Cornell University and the University of California.
As the Wall Street Journal describes it, the purpose of the live experiment was to "determine whether it could alter the emotional state of [Facebook] users and prompt them to post either more positive or negative content." To achieve this, the site, without the knowledge of those subjected to the research, "enabled an algorithm, for one week, to automatically omit content that contained words associated with either positive or negative emotions from the central news feeds of 689,003 users."
According to the abstract of the study:
When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others' positive experiences constitutes a positive experience for people.
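The filtering mechanism the Journal describes can be pictured with a short sketch. The Python below is purely illustrative and is not the study's actual code: the word lists, the `Post` structure, and the suppression rate are all hypothetical stand-ins for the researchers' emotion lexicon and the omission step the article describes.

```python
# Illustrative sketch only -- NOT Facebook's actual code. Word lists,
# names, and the suppression rate are hypothetical, based on the
# article's description of an algorithm that omits posts containing
# emotion-associated words from a user's news feed.
import random
from dataclasses import dataclass

# Hypothetical word lists standing in for the study's emotion lexicon.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "terrible"}

@dataclass
class Post:
    author: str
    text: str

def contains_words(text: str, words: set[str]) -> bool:
    """Return True if any word from `words` appears in `text`."""
    return bool(set(text.lower().split()) & words)

def filtered_feed(posts: list[Post], suppress: set[str], rate: float) -> list[Post]:
    """Omit a fraction `rate` of posts that contain a suppressed emotion
    word; all other posts pass through unchanged."""
    result = []
    for post in posts:
        if contains_words(post.text, suppress) and random.random() < rate:
            continue  # silently drop this post from the user's feed
        result.append(post)
    return result

# Example: suppress some positive posts in one hypothetical user's feed.
feed = [Post("a", "What a wonderful day"), Post("b", "Traffic was awful")]
print(filtered_feed(feed, suppress=POSITIVE_WORDS, rate=0.5))
```

The point of the sketch is how little is required: a word list, a filter, and a feed over which the platform has total control. The affected users would see nothing except a subtly different mix of posts.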
Though the research was first published in an academic journal earlier this year, it drew new attention over the weekend after New Scientist reviewed its findings and reported that people's real-life emotions, "like [computer] viruses, can spread through online social networks."
And blogger Sophie Weiner, writing for AnimalNewYork, responded: "What many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we'll respond to but to actually change our emotions."
Numerous outlets then picked up the story, with the Guardian reporting that the lawyers, internet activists and politicians it spoke with used words like "scandalous", "spooky" and "disturbing" to describe the mass experiment in emotional manipulation.
According to the New York Times:
Although academic protocols generally call for getting people's consent before psychological research is conducted on them, Facebook didn't ask for explicit permission from those it selected for the experiment. It argued that its 1.28 billion monthly users gave blanket consent to the company's research as a condition of using the service.
But the social network's manipulation of its users' feelings without their knowledge stirred up its own negative reaction. Some Facebook users and critics suggested that the company had crossed an ethical boundary.
As the Guardian reported, online "commentators voiced fears that the process could be used for political purposes in the runup to elections or to encourage people to stay on the site by feeding them happy thoughts and so boosting advertising revenues."
The newspaper quoted Clay Johnson, the co-founder of Blue State Digital, the firm that built and managed Barack Obama's online campaign for the presidency in 2008, who said: "The Facebook 'transmission of anger' experiment is terrifying."
As Johnson wrote on Twitter: "Could the CIA incite revolution in Sudan by pressuring Facebook to promote discontent? Should that be legal? Could Mark Zuckerberg swing an election by promoting Upworthy [a website aggregating viral content] posts two weeks beforehand? Should that be legal?"