New Technology Renews Old Fears of Manipulation and Control

ACLU Blog of Rights


In the first half of the 20th century, Americans gained a new awareness of the malleability and manipulability of the human mind, and the result was a wave of concern over “propaganda” and other techniques of influence. Today we may be seeing a new wave of similar fears as we begin to wonder whether the ways we use and rely upon technology today are making us susceptible to new, dangerous forms of manipulation.

The first wave, in the 20th century, resulted from a number of factors. These included the discovery of a passionate, irrational unconscious by Freud and Jung, and a reaction against the seemingly mindless march toward slaughter in World War I, both of which fed into a broader disillusionment with the enlightenment rationalism of the 19th century and its faith that humans were ultimately orderly, rational beings. Other factors included the increasingly modernized advertising industry and its surprising success in manipulating consumers, and later the use of propaganda techniques by the fascists and communists in Europe.

The sudden awareness of human vulnerability to manipulation was embraced by some, but it also sparked fears that the government would use it to control the beliefs of the population, rather than reflect those beliefs as it should in a democracy. Edward Bernays, considered the “father of public relations,” wrote a highly influential 1928 book entitled Propaganda, in which he argued that human manipulability was a good thing. He wrote,

The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country. We are governed, our minds are molded, our tastes formed, our ideas suggested, largely by men we have never heard of.

Bernays had the unrestrained faith in expertise and government that was characteristic of the era’s Progressives, but many were not so sanguine. In the first decade of the 20th century, the hiring of press agents by government agencies (first by the Panama Canal Commission and then by the Forest Service and other agencies) sparked fierce controversy and opposition. In 1913 Congress banned the executive branch from using funds to employ “any publicity expert.” Later that decade Congress also enacted the Anti-Lobbying Act of 1919, which barred agencies from using funds “intended or designed to influence in any manner a Member of Congress to favor or oppose, by vote or otherwise, any legislation or appropriation by Congress.”

These acts were largely unsuccessful. During World War I, the government created a Committee on Public Information, an agency founded for the explicit purpose of using propaganda techniques to make the U.S. public enthusiastic about entering the war. In the 1920s, after public sentiment shifted toward the view that involvement in the war had been a mistake, many viewed this agency as part of the problem.

Concern and controversies were still roiling after the Second World War. In 1947, for example, the Pentagon launched a large-scale lobbying and public relations effort on behalf of Truman’s proposal to institute the draft, prompting an investigation into the issue by a House subcommittee, which charged in its report that the War Department and its employees had “gone beyond the limits of their proper duty of providing factual information to the people and the Congress and have engaged in propaganda supported by taxpayers' money to influence legislation now pending before the Congress.”

In 1948, Congress enacted the Smith-Mundt Act, which authorized the State Department to work to influence the attitudes and opinions of populations overseas via the Voice of America—but also banned the use of funds “to influence public opinion in the United States.”

A new wave of concern?

Concern over manipulation by government and companies has never really gone away, with fresh controversies emerging periodically, but today we may be seeing a whole new wave of concern—and of reason to be worried. There have been several stories in recent months highlighting ways that today’s technology could be used to manipulate and control. Foremost among them was the uproar over “experimentation” by Facebook, which manipulated the “mood” of posts seen by some users to see if it affected the happiness or sadness of the content posted by those users. An echo of the controversy took place a few weeks later when OKCupid wrote about its own experiments on users.

Not long after the Facebook story broke, Glenn Greenwald reported that the British spy agency GCHQ had developed a suite of methods and tools for manipulating internet content, such as spreading disinformation, manipulating the results of online polls, inflating pageview counts, and amplifying or suppressing content on YouTube.

The Facebook revelation sparked an immense amount of discussion, much of it focused upon things like informed consent, ethical oversight, Institutional Review Boards, and the potential effects on particular people such as depression sufferers (for example, see these critical pieces and this defense of Facebook, which, though ultimately unpersuasive, does a clear job of explaining how Facebook filters content). But the most trenchant analyses looked past the ethics of experimentation to a broader question: what does this incident tell us about the growing power of institutions to manipulate and control individuals?

As Kate Crawford pointed out in the Atlantic,

some truly difficult questions lie in wait: What kinds of accountability should apply to experiments on humans participating on social platforms? Apart from issues of consent and possible harm, what are the power dynamics at work? And whose interests are being served by these studies?

Putting her finger on what I think was the most significant thing about this story, she writes that it gives us a glimpse of “how highly centralized power can be exercised.”

Similarly, Zeynep Tufekci writes,

these large corporations (and governments and political campaigns) now have new tools and stealth methods to quietly model our personality, our vulnerabilities, identify our networks, and effectively nudge and shape our ideas, desires and dreams. These tools are new, this power is new and evolving…. I identify this model of control as a Gramscian model of social control: one in which we are effectively micro-nudged into “desired behavior”…. Seduction, rather than fear and coercion are the currency, and as such, they are a lot more effective.

The new wave of awareness of our potential to be manipulated and controlled may also include the network neutrality issue, which is in great part about such fears.

There are two lessons we could learn from looking back at the earlier history of such fears. One could be that we’ll get over this, as we did with fears around advertising manipulation, and today’s new concerns will come to seem quaint, as some of the old ones do. But a better lesson, I would argue, is that the fears identified last century were for the most part entirely legitimate and well-founded, and that “techniques of influence” have been abused in many ways—not least by playing a key role in some of the greatest catastrophes of the 20th century. Now we have new reason to worry, and to insist upon checks and balances as our government uses technology in new ways, and as we allow manipulable technologies like Facebook to become ever more central to the way we communicate, gather information, and relate to others.

Jay Stanley

Jay Stanley is Senior Policy Analyst with the ACLU’s Speech, Privacy and Technology Project, where he researches, writes and speaks about technology-related privacy and civil liberties issues and their future. He is the Editor of the ACLU's "Free Future" blog and has authored and co-authored a variety of influential ACLU reports on privacy and technology topics.
