Half a century ago, Philip Zimbardo’s controversial Stanford Prison Experiment and Stanley Milgram’s earlier, infamous obedience experiments demonstrated the importance of context in explaining why people do bad things to each other.
In 1961, Yale’s Milgram recruited ordinary people from all walks of life and instructed them to administer electric shocks of up to 450 volts to test subjects. Why did they comply? Because a Yale scientist in a lab coat told them they had to.
A decade later, Stanford’s Zimbardo enlisted college volunteers for a “prison study” in which students were randomly selected to play the role of guard or prisoner. Within a day or two, the “guards” were psychologically abusing the “prisoners.” The guards weren’t allowed to physically abuse their inmates but they did place them in isolation and deprive them of food and sleep. In submitting to psychological torture, some of the “prisoners” lost track of being part of an experiment and began behaving as if they were actually criminals serving jail time. Worse, psychologist Zimbardo began taking his prison “warden” role seriously. An experiment collapsed into a nightmare.
The two studies showed that a person’s transformation into a bully had as much to do with context as with individual predisposition, if not more. Not all the “guards” in the Zimbardo study were actively abusive. Yet not one of them tried to stop another from tormenting “prisoners.” This lined up with the Milgram study: none of his recruits declined to administer shocks and none balked before reaching 300 volts, far more than the jolt you’d get by sticking your finger in a live socket (there were no actual shocks, but the test subjects flicking the switch were unaware of the ruse).
Pondering these experiments, it’s hard not to wonder how we’d behave in similar circumstances.
A few years ago, Zimbardo was in Rome to present his 2007 book “The Lucifer Effect,” in which he studies the transformation of good into evil. He explained how U.S. prison guards at Iraq’s Abu Ghraib prison came to torture prisoners in a situation that resembled an extreme replica of the Stanford study. At Abu Ghraib, guard shifts were extremely long, with the guards receiving no days off and quartered inside the prison itself. Built by the British in the 1950s, Abu Ghraib was wretchedly filthy and inhumane, for guards and prisoners alike, and located in an area under constant bombardment. There were no cameras and little supervision. What eventually happened resembled the worst of the Stanford “prison” in terms of systematic dehumanization, torment, and torture. The result made it easy to blame the guards — the “bad apples” — when in fact they’d been set up (11 U.S. reservists were ultimately court-martialed and discharged in 2005, with two receiving jail terms).
But how the guards came to behave was entirely predictable given the organization of the prison. Good people can do bad things — “induced, seduced, and initiated into behaving in evil ways,” writes Zimbardo. They can and will set aside morality and better judgment.
Recent studies by social psychologists have helped shed light on the contextual factors behind such behavior. Bad things often happen because nobody intervenes to stop them.
Take Zimbardo himself. He was so captivated by his fantasy role as “warden” that he failed to intervene even while watching “guard” abuse on closed-circuit TV. “Phil, what are you doing!” a graduate assistant shouted at one point. “These are just kids!” Only then did Zimbardo wake from his trance and realize that he’d come to see a fantasy situation as real, as had a lawyer and a priest who had been summoned to play attorney and chaplain in imitation of an actual prison.
What else played a part? Why did no one intercede?
The “guards” wore uniforms and reflective sunglasses to mask their expressions. Subsequent studies have shown that people in disguises are far more inclined to inflict harsher punishment. They have freer rein when not identified or individualized. (So much for the idea that schools should adopt uniforms to discourage bullying; the evidence would seem to suggest quite the opposite.)
In the Stanford study, “prisoners” were both dehumanized and de-individualized. They wore stocking caps, numbered pre-op hospital gowns, and walked in flip-flops. Their numbers were their names, canceling individuality (it’s more difficult to hurt an individual than someone labeled as a cipher).
The “guards” — the fake ones at Stanford and the real ones in Iraq — endured long, boring shifts. Boredom can bring out the worst in people, and power over others can give boredom a dangerous outlet.
Add an educational culture in which children are taught to blindly obey authority and you’re left to ponder a deeply troubling question. How are we to expect that “guards” or similar subalterns won’t obey bad authorities, or capitulate to the wider authority of the group to which they belong?
If we fail to emphasize individuality, how can we expect individuals to hold themselves accountable, or find the personal courage necessary to stand up to or stop a bully or an abuser? “Evil,” says Zimbardo, “is knowing better but willingly doing worse.” Speaking out against the crowd is a form of moral heroism. It is creating the right context for such heroism that remains elusive.