PARIS – What prompts ordinary people to commit acts of evil?
The question has been debated by philosophers, moralists, historians and scientists for centuries.
One idea carries much weight today: almost anyone – you or I included – is capable of carrying out atrocities if ordered to do so.
Commanded by an authority figure, and wishing to conform, we could bulldoze homes, burn books, separate parents from children or even slaughter them, and our much-prized conscience would not so much as flicker.
Called the “banality of evil,” the theory has been proffered as an explanation for why ordinary, educated Germans took part in the genocide of Jews during World War II.
Now psychologists, having reviewed an opinion-shaping experiment carried out more than 50 years ago, are calling for a rethink.
“The more we read and the more data we collect, the less evidence we find to support the banality of evil idea, the notion that participants are simply ‘thoughtless’ or ‘mindless’ zombies who don’t know what they’re doing and just go along for the sake of it,” said Alex Haslam, a professor at the University of Queensland in Australia.
“Our sense is that some form of identification, and hence choice, generally underpins all tyrannical behavior.”
Their detective work focused on legendary experiments conducted in 1961 by Yale University psychologist Stanley Milgram.
Volunteers, told they were taking part in an experiment on learning, were led to believe they were administering electric shocks to a man, dubbed the “learner,” who had to memorize pairs of words.
Evil of Eichmann
Every time the learner made a mistake, the “teacher” was told by a stern-faced, lab-coated official to crank up the shock, starting with a mild 15 volts and climaxing at a lethal 450 volts.
The experiment was fake – the learner was an actor and the shocks never happened. The teacher could hear, but not see, the learner.
Frighteningly, in one test, nearly two-thirds of volunteers continued all the way to “lethal” voltage, even when the learner pleaded for mercy, wept or screamed in agony.
These experiments became enshrined in textbooks as an illustration of how the conscience can be put on hold under orders.
The findings meshed with a landmark book by the writer Hannah Arendt on the 1961 trial of Adolf Eichmann, an architect of the Holocaust.
Far from the monster she had expected, Arendt found that Eichmann came across more like a petty bureaucrat, prompting her to coin the term “banality of evil” to suggest how ordinary people, by conforming, could commit atrocities.
The new research, published in the British Journal of Social Psychology, took a closer look at Milgram’s “teachers.”
A team sifted through a box in the Yale archives that contained comments written by the volunteers after they were told the purpose of the experiment, and that the torture had been fake.
Of the 800 participants, 659 submitted a reaction. Some said they had felt unease or distress during the tests, but most reported being positive about the experience, some extremely so.
‘Unconscionable things’
“To be part of such an important experiment can only make one feel good,” said one.
“I feel I have contributed in some small way toward the development of man and his attitudes towards others,” said another.
“If it [is] your belief that these studies will benefit mankind then I say we should have more of them,” said another.
Were these happy comments spurred by relief, after volunteers learned they had not, in fact, hurt anyone?
No, suggests the paper. A sense of pleasure, of duty fulfilled, of having served a higher calling, pervaded the comment cards.
Milgram had also given the volunteers a dose of mission-priming before the experiment: without revealing what it would entail, he told them that taking part would advance the cause of knowledge.
Participants’ awe of Ivy League Yale played a role, too – obedience levels were higher there than when the experiments were run in offices in Bridgeport, Connecticut.
Milgram “was a skilful dramatist as well as a psychologist,” said Kathryn Millard, a professor at Macquarie University, Sydney.
Far from supinely obeying the lab-coated overseer, the paper argues, volunteers escalated the shocks because they believed they were acting for a noble cause: science.
“The ethical issues here [are] more complex than commonly supposed,” Haslam told AFP by email.
“It is apparent Milgram assuaged participants’ concerns by making them believe in a noxious ideology – namely, that it is acceptable to do otherwise unconscionable things in the cause of science.”
Stephen Reicher, a professor at the University of St Andrews in Scotland, said the implications were far-reaching.
It showed that ordinary people could commit acts of extraordinary harm, but that thoughtlessness was not the main motivator, he said.
“We argue that people are aware of what they are doing, but that they think it is the right thing to do,” he said.
“This comes from identification with a cause – and an acceptance that the authority is a legitimate representative of that cause.”