
How Mass Persuasion Works

A new book traces the language of brainwashing to a series of Cold War–era scares.


The term “Stockholm syndrome” comes from a 1973 bank heist that started out normally enough. An experienced robber named Jan-Erik Olsson entered Kreditbanken in Norrmalmstorg square, Stockholm, wearing a disguise, and demanded money at gunpoint. Police arrived on the scene quickly, and Olsson reacted by taking four bank employees hostage. In return for their freedom, he demanded three million kronor, a getaway vehicle, weapons, and the release of his friend Clark Olofsson from prison so that he could be brought to the scene. The minister of justice granted this last request, hoping to use Olofsson as a go-between for negotiations, but the pair barricaded themselves and the hostages in a bank vault, refusing to come out until their remaining demands were met. Police ended the siege five days later by throwing tear gas into the vault.

Dark Persuasion: A History of Brainwashing from Pavlov to Social Media
by Joel E. Dimsdale
Yale University Press, 304 pp., $28.00

Olsson and Olofsson showed both brutality and kindness to their hostages in the interim. They kept the captives on leashes, but allowed them to make phone calls. One hostage, Kristin Ehnmark, used hers to antagonize the Swedish prime minister, accusing him of “playing checkers” with her life. To Sweden’s great surprise—the crime was the first to be covered in real time by national television—she urged him to give Olsson what he wanted and get the police off the scene, fearing that they would escalate tensions and lead to someone being killed. The hostage situation was eventually resolved with no fatalities, and the captors and hostages shook hands goodbye. As the robbers walked off in police custody, one of the hostages called out, “Clark, I’ll see you again!”

Joel E. Dimsdale tells this story in Dark Persuasion: A History of Brainwashing from Pavlov to Social Media, a study of the art of getting another person to do what you want. The hostages’ behavior was seen as so unusual that this dynamic—where hostages are said to identify with their captors—was described as its own “syndrome.” Stockholm syndrome has since come to be used for various sorts of paradoxical behavior. A woman who stays with her abusive husband might be said to have Stockholm syndrome, in common parlance, because although she may not literally be held hostage, she makes decisions that keep her in a risky situation and are ultimately counter to her interests.

The problem with the Stockholm syndrome origin story, Dimsdale explains, is that it’s not entirely true. He cites a 1977 study of 77 separate “hostage and barricade situations,” which found that “four times as many hostages died in the crossfire of assault by security forces than were executed by terrorists.” So, Ehnmark’s frustration with the police may not have been paradoxical at all, but straightforward pragmatism. She may have been right to fear the police more than she feared Olofsson and Olsson. That a “syndrome” might have such a confused backstory and yet such sticking power in the popular lexicon shows how much narrative is involved in the way we discuss emotion.

A similar problem afflicts the term “brainwashing,” a concept Dimsdale presents as the fruit of midcentury competition in the field of psychiatric research during the Cold War; a tool exploited by cultists and religious leaders; and ultimately a phenomenon that eludes easy definition while growing ever harder to ignore. We live in a time of mass persuasion, where disinformation flourishes on social media and vaccine skepticism jeopardizes America’s recovery from Covid-19. Lunatic conspiracy theories and fringe ideologies foster very real violence on the reactionary right, while marketers practice the art of convincing people to buy things at scales too vast to visualize. Dark Persuasion suggests that the language of brainwashing is incommensurate with the problems of our time, and a hangover from the era of America’s most paranoid wars.

The term brainwashing was first used in 1950, Dimsdale writes, as a translation of the Chinese xǐ nǎo, meaning “wash brain,” in the florid writing of the journalist and CIA agent Edward Hunter. He was explaining, Dimsdale says, why some American POWs defected after their internment in Korea and China during the Korean War: They had been subjected to a form of psychological torture so intense that it destroyed their reason. “In brainwashing,” Hunter wrote, “a fog settles over the patient’s mind until he loses touch with reality. Brainwashing is something new which is contrary to human nature and inseparable from communism.”

He was describing a method of coercive persuasion that China used on prisoners of war, a combination of indoctrination and what Dimsdale calls “DDD”: debility, through injury or deprivation of sleep, food, or the like; dependency, where the prisoner is returned to an infantile state, reliant on captors for everything; and dread, which is self-explanatory. It’s hard to separate the truth of what was actually going on from the speculations of American authorities, but to his credit Dimsdale doesn’t really try. Instead, he posits the Korean War as the birthplace of brainwashing—a war that did not actually see a higher-than-usual number of defectors, Dimsdale notes, but that inspired an era of intense, often unethical research into the way the human mind behaves under stress.

Whether the American defectors of the Korean War were tortured, persuaded, or otherwise, the theory of brainwashing suggested that POWs had been coerced into defecting, rather than doing so through the use of their reason. The episode and its explanation opened up frightening new possibilities for the U.S.: How could leaders trust their people to stay loyal in the face of such techniques? But also: How might they use such techniques to their own advantage?

The CIA pursued its own quests for mind control—looking for a truth serum, or a way to selectively delete somebody’s memories—through the MKUltra project, a covert operation run through the 1950s and ’60s in coordination with the United States Army Biological Warfare Laboratories. The project channeled money to scientists to pursue experiments on human subjects involving sensory deprivation, sleep manipulation, and high doses of drugs, including mescaline, barbiturates, and LSD. Many of these studies involved the drugging of unwitting participants, with the aim of seeing whether they would confess their innermost secrets. In 1972 Dr. Robert G. Heath tried to reprogram a gay man into heterosexuality using electrodes implanted in his brain. Researcher Ewen Cameron forced patients to listen to tapes of his own voice for hours each day. Hundreds of people were experimented upon in the most paternalistic of ways, and a 1994 inquiry found that MKUltra-funded researchers committed several crimes—all for the sake of keeping up with the Soviets when it came to controlling people’s minds.

All that these experiments proved was that it’s quite easy to destroy a mind with fear and lack of sleep; it is, however, impossible to then restore a person’s mind in a controlled way. No matter the extremes psychopharmacologists, behavioral scientists, and neuroscientists went to, the effects of trauma on the mind proved simply too unpredictable to turn into a system. The idea of “brainwashing,” then, was always a fantasy, born out of a paranoid and violent era in American history, and it blossomed in parts of the counterculture of the 1970s.

Cult leaders, unlike interrogators, have no need of reliable methods for extracting true information, but they know how to break people. As a longtime resident of San Diego, Dimsdale was stunned to learn of the mass suicide that took place among members of the Heaven’s Gate movement in 1997. The group was founded in 1974, and members believed that the comet Hale-Bopp was trailed by a spaceship that would take them to the next life upon their exit from this one; they died in bunk beds wearing matching sneakers. It was reminiscent of the 1978 mass death at Jonestown, when community leader Jim Jones convinced (and in some cases compelled) hundreds of his followers to commit “revolutionary suicide” at their compound in Guyana.

Contemplating how these leaders cut their flocks off from the outside world and made them live in cloistered communal conditions, Dimsdale compares their crimes to the capture and conversion of the heiress Patricia Hearst by the Symbionese Liberation Army in 1974. She “converted” to her captors’ side, and shot at someone at their behest during a robbery. At her trial, Hearst’s lawyers asked whether she could really be held accountable for her actions, considering that her captors had kept her in a closet for several weeks, essentially holding her prisoner. The implication of this argument was that personal responsibility is lessened in brainwashed people.

Throughout Dark Persuasion, Dimsdale shows how the idea of brainwashing has been used to justify something much cruder: not the intricate manipulation of another person’s thoughts but simple abuse, whether conducted by the government or by the leaders of a cult. The term always had more political explanatory power than actual psychological basis. If it does have any real meaning, it is as a way of indicating a broader anxiety—about the threat of a hazily understood ideological foe.