Monday, 30 May 2011

Avoiding awkwardness, Stanley Milgram's obedience experiment in perspective, part two

The Milgram experiment (see part one of this article, below) came up with findings that no-one predicted or wanted to face. Ordinary people, when instructed to do so, were quite willing to give excruciatingly painful, possibly lethal, electric shocks to strangers.

The book Milgram wrote about the experiment, Obedience to Authority, was an attempt to explain these findings. He showed, convincingly I think, that one explanation – that the experiment enabled a repressed pleasure in inflicting pain to come to the surface – did not make sense.

But the alternative, the only explanation left, was no less disturbing. On reflection it was probably more so. Most people weren’t well-camouflaged sadists. But they were willing to be transformed into instruments in the hands of others. They could easily commit destructive acts at the behest of a “legitimate authority”. And in doing so they lost all sense of responsibility and felt no guilt.

But there was a chink of light. Although most did what they were told, around a third of people did not. They defied authority and refused to deliver electric shocks.

You can see their disobedience enacted in this series of films of the original experiment:

The conclusion Milgram came to through observing hundreds of people was that while obedience was straightforward (although unpleasant), disobedience was hard. It involved transforming the feeling that something was wrong, first into dissent, and then into action.

“The psychic cost was considerable,” writes Milgram. The price of disobedience was a gnawing sense of faithlessness. Subjects who disobeyed were troubled that they had “disrupted the social order”.

Milgram took pains to list the different stages leading to disobedience – strain, inner doubt, externalisation of doubt, dissent and finally action – because bringing something out into the open and understanding it makes it easier to do.

The Left, anti-capitalism, depends upon the spreading of disobedience. By definition, it involves disrupting the social order. Understanding why disrupting the social order is hard, why conservatism exerts such a deep psychological pull, is essential.

The basic Milgram experiment underwent several variations. But the most effective change in bringing about disobedience was quite simple – other people disobeying.

Two actors, posing as fellow subjects who were also supposed to shock the victim, were introduced. They refused to continue when told to give the learner electric shocks for his wrong answers.

No variation was so effective in getting the real subject to disobey the experimenter. Under this condition of peer rebellion, only 10 per cent of subjects went on to deliver the highest electric shock. In the basic experiment, 63 per cent of subjects did.

Part of the reason was ideological. “The peers instil in the subject the idea of defying the experimenter,” says Milgram. “It may not have occurred to some subjects as a possibility.”

With peer rebellion comes the acceptance of rebellion as natural, as something valid because other people are doing it. There is, in a way, a battle between obedience and conformity. Between doing what you are told and doing what other people are doing.

Strange as it may sound, conformity is a significant element in disobedience.

But most subjects did not have the option of joining in a mini-rebellion. In the basic experiment, they were isolated and thus more easily turned into instruments of authority.

Becoming an instrument meant being transformed into an amoral state, where the pangs of conscience didn’t affect behaviour. But, ironically, what kept people in that state (what Milgram calls the agentic state), what stopped them disobeying, were feelings and moral reactions.

Rebellion means disrupting the social order, if only in a minor way, and, in order to avoid that embarrassing situation, most people went on giving electric shocks.

The wish to avoid awkwardness was so strong that, in many cases, it proved stronger than the feeling that it was wrong to give possibly lethal electric shocks to another person.

Confronting and exposing that awkwardness would mean putting the authority figure, in this case the experimenter, in a difficult situation, completely undermining him. A great many obedient subjects recoiled from doing that.

“It is a curious thing that a measure of compassion on the part of the subject, an unwillingness to ‘hurt’ the experimenter’s feelings, are part of those binding forces inhibiting disobedience,” says Milgram.

No debate occurred about what was happening, no weighing up of the ethics of what was being proposed. 

“Obedience,” says Milgram, “does not take the form of a dramatic confrontation of opposed wills or philosophies but is embedded in a larger atmosphere where social relationships, career aspirations and technical routines set the dominant tone.”

In everyday life hierarchical institutions – companies, bureaucratic agencies – are always protected by a human shield. In order to change or rebel against institutional behaviour, individuals must confront and make life difficult for other people working in that hierarchy – often people who, quite legitimately, are not responsible for the results of institutional behaviour. This human shield is a great weapon in maintaining the status quo.

What is exploited is a common desire to avoid conflict. But conflict is absolutely essential if progress is to be made. It has to happen.

There is an ongoing contest between the efforts of institutions to submerge conflict in “social relationships, career aspirations and technical routines” and the desire of revolutionary movements to bring that hidden conflict out into the open, and to depersonalise it.

One of the great exponents of bringing deliberately hidden conflict into the fresh air was the American community organiser, Saul Alinsky.

You can hear a radio interview with Alinsky, who died in 1972, here:

Alinsky consciously rejected the idea that the main point of life was “being liked and not offending others”. 
He spoke approvingly of wielding power and of engaging in conflict. “Conflict is the essential core of a free and open society,” he said.


There were limitations in what Alinsky was trying to do. His aim was to create “mass power organizations” to confront corporations, governments and public agencies. He was very effective at wringing concessions from institutions like these, but didn’t want to go further. It was as if life should be a never-ending conflict. He deliberately, and wrongly, eschewed the idea of gaining economic power within corporations.

But he cut through the superficial politeness that obscures the real conflict underneath. He refused, as he said, “to detour round reality”.

He also understood that people, if they are unorganised, are at the mercy of hierarchical institutions that are remorseless in their pursuit of their aims. He wanted to even up the scales.

Alinsky’s progeny, like the English group London Citizens, are equally convinced that without popular political organisation, people will be the victims of powerful institutions. “Only an organised people can control organised money,” they say.

One lesson from Milgram’s experiment is that ordinary people unconsciously permit themselves to be used by institutions for malevolent ends, and that what is really going on, the underlying power relations, remains obscured by a patina of politeness, etiquette and moral obligation.

But they can be woken up. At one point, Milgram likens their situation to dozing. But obedience, like sleep, can be disturbed.

Sunday, 22 May 2011

Interlude: All of them Witches, psychopaths in high places


Proof, if it were needed, that the lessons of Stanley Milgram’s obedience experiment have not been heeded comes from an article in the UK Guardian on Saturday – an excerpt from a new book by Jon Ronson, The Psychopath Test.

In the article, Ronson talks to Robert Hare, FBI consultant on psychopaths and author of the 20-item psychopath checklist:

“Serial killers ruin families,” shrugged Hare. “Corporate and political and religious psychopaths ruin economies. They ruin societies.”

Ronson then goes off to interview an American corporate CEO, wondering if he is a psychopath.

Now Hare also featured in the book and movie The Corporation. He applied the psychopath checklist to the behaviour of corporations and found a remarkable match-up: superficial charm, grandiose sense of self-worth, lack of empathy, irresponsibility, a focus on short-term goals, and so on.

But you can’t have it both ways. You can’t blame individual psychopaths in high places for “ruining the economy” (presumably he means the financial crisis) and simultaneously say that entities like corporations behave psychopathically. And thus ruin economies.

It may well be that there are an unusual number of psychopaths at the top of corporations. There might be lots of executives who get a kick out of firing workers or destroying rival companies. But it also doesn’t matter. Because there are also lots of non-psychopathic individuals at the top of corporations who don’t enjoy firing workers, but end up doing it anyway.

As The Corporation says, most corporate executives lead morally compartmentalized lives. “The people who run corporations are, for the most part, good people, moral people. They are mothers and fathers, lovers and friends, and upstanding citizens in their communities, and they often have good and sometimes even idealistic intentions.” They don’t, however, change “the corporation’s fundamental institutional nature: its unblinking commitment to its own self-interest”.

There is a parallel in Milgram’s experiment (see previous article). A few people who took part were clearly sadists. They enjoyed giving what they thought were electric shocks and gleefully went on to the end of the board. But most people who gave 450 volt electric shocks were not sadists and took no pleasure in what they did. But they did it just the same. In this context, and those of corporations, feelings don’t matter, actions do. As Milgram said: “Subjective feelings are largely irrelevant to the moral issue at hand so long as they are not transformed into action”.

If we concluded that the top levels of corporations and governments were honeycombed with psychopaths, we could, in theory, try to hunt them all down and replace them with better, well-balanced people. But it would be futile. The well-balanced people would either not last five minutes, or else would do what the institutions required them to do.

In the words of the economist Richard Wolff, "Capitalism systematically organises its key institutions of production – the corporations – in such a way that their boards of directors, in properly performing their assigned tasks, produce crises, then undermine anti-crisis reforms, and thereby reproduce those crises."
 
"Hence," he says, "attention is slowly shifting to questioning the one aspect of capitalism that was never effectively challenged, let alone changed, across the last century and more: the internal organization of corporations."

Psychopaths are not required for economies to be ruined.

The debate over whether the psychopath maketh the corporation or the corporation maketh the psychopath might appear to be an enjoyable down-the-pub, chicken-and-egg question. But it’s more than that. Depending on the answer, we could take a significant step towards understanding the state we’re in, or retreat further into the comforting delusions of mind-numbing conservatism.

Thursday, 19 May 2011

Willing Slaves: Stanley Milgram's obedience experiment in perspective. Part one


The Milgram experiment, described as “one of the most controversial experiments in history”, was first carried out when John F Kennedy was US President. Milgram’s own book about the experiment, Obedience to Authority, was published before the end of the Vietnam War. Why delve into ancient history?

Partly because its findings about human nature are timeless. In 2009 a BBC recapitulation of the experiment came up with very similar results to those obtained in 1961. 

But, more importantly, the experiments reveal a fundamental misconception about why people behave as they do, a “seriously distorted view of the determinants of human action”, in Milgram’s words. This delusion has, if anything, become more acute.

The idea that we alone control our behaviour and that our personal wants and desires shape the world around us, is a fundamental axiom of everyday ideology. Though the clamour for ethical responsibility is intense, it is also pointless. As the Milgram experiment demonstrates, ordinary people can perform horrific acts precisely because they don’t feel responsible, and don’t control what they do.

The experiments also shed light on why people don’t disobey. It is something that the Left, which is predicated upon disobedience, has to understand if it is ever to progress. What Milgram shows is that circumstances by themselves, no matter how terrible, don’t produce disobedience. In fact, some people would rather kill than disobey authority.

It is a dark, negative view of human nature, though Milgram would add, realistic. The Left, by contrast, is usually assumed to believe in human decency. But an understanding of Milgram’s findings leads to inescapably left-wing conclusions.

What did the infamous experiment involve? You can see a film of the BBC replication of the experiment here:



Volunteers were recruited to take part in an experiment on learning. But this was just a cover-story. A real experiment was taking place but it was not about learning.

A learner, an actor, was strapped into an “electric chair”. In an adjacent room, the volunteers were seated in front of an electric shock generator with 30 switches running from 15 to 450 volts in 15-volt increments. The switches were labelled, going from “slight shock” and “moderate shock” all the way up to “extreme intensity shock”, with the last two marked simply “XXX”. In reality the machine did not deliver any electric shocks, though the volunteers believed that it did.

The volunteers were told to shock the learner every time he got an answer wrong. The shocks were to increase in intensity each time. It was prearranged that the learner would get most answers wrong.

The victim’s cries from being shocked were recorded on tape and played so that the volunteer could hear them. They progressed from grunts and groans to exclamations like “I refuse to go on” and “I can’t stand the pain”, and finally an agonised scream. Then there was just an ominous silence.

If the volunteer protested, he was told to go on by an “experimenter” in a lab coat.

When told about the experiment, the vast majority of people, including psychiatrists, expected people to disobey and refuse to go on. “These subjects see their reactions flowing from empathy, compassion and a sense of justice,” says Milgram. Even people who watched the experiment through one-way mirrors expected disobedience. Only a pathological fringe of sadists would go on shocking, it was thought.

But in most cases – around two-thirds in the basic experiment – the result was obedience. People went on giving electric shocks up to the highest level, even though, in many cases, they feared the victim was dead. When the experiment was replicated in different locations and countries, obedience was even higher. In Munich, for example, 85 per cent obeyed to the final shock on the generator. Empathy and compassion did not win out.

But the point of the experiment was not that most people are sadists just beneath the surface. If the experiment was tweaked so that volunteers were not ordered to shock, but merely told it was fine if they went on to the highest possible electric shock, they invariably didn’t. In fact, the vast majority stopped before the shocks were painful.

 In another variation, there were two experimenters who disagreed. One wanted to go on shocking, the other didn’t. The effect was to stop the experiment dead in its tracks. No-one took advantage of the conflict to give more electric shocks.

Palpably, volunteers did not like what they were told to do. When ordered to go on shocking, they protested, they sweated, they shook, they even laughed hysterically at themselves. They were obviously experiencing great stress. But most went on obeying.

“It is the extreme willingness of adults to go to almost any lengths on command of an authority that constitutes the chief finding of the study,” writes Milgram, “and the fact most urgently demanding explanation.”

The book Milgram wrote about the experiment, Obedience to Authority, is full of references to atrocities committed in the Vietnam War and by Nazi Germany. But at one point, Milgram says that to focus just on what the Nazis did is to “miss the point entirely” of the experiments.

In fact, to focus on atrocities or the human propensity to inflict pain on others is to miss the point entirely. What the experiment demonstrates is not just that most people will obey malevolent authorities, but their willingness to obey authorities of any kind, of varying degrees of malevolence. Not just amoral scientists or generals, but companies and governments.

And it is when Milgram tries to explain the willingness to obey that his book becomes really enlightening.

Milgram says that obedience works when people are in what appears to be a paradoxical state of mind. Obedience has to be willingly entered into, but the actions it entails have nothing to do with the personality of the person who carries them out. It is one of the findings of the experiment that motives were irrelevant. Cruel people did not deliver more electric shocks than kind people. Yet, at the same time, nobody put a gun to their head.

Willingness is vital because it creates a sense of moral obligation. “The psychological consequence of voluntary entry is that it creates a sense of commitment and obligation which will subsequently play a part in binding the subject to his role,” says Milgram.

This is why neoliberal thinkers are so intransigent, although completely wrong, on this point. To neoliberals, a person selling themselves on the labour market is no different to a person selling any other type of commodity. It is a voluntary process and so should not be interfered with.

So, to neoliberals, obedience in a job ought to, morally should, follow naturally because the contract has been “voluntarily” entered into. As Milgram says, if obedience is willing, compliance is “easily exacted”.

But if obedience is not willing, compliance depends on direct surveillance. If surveillance ends, obedience stops.

Under voluntary obedience, control comes from within the person. Therefore, there is an internalised basis for obedience, not just an external one.

That is why ideology, or as Milgram phrases it, “the definition of the situation”, is so important. “Control the manner in which a man interprets his world,” he says, “and you have gone a long way toward controlling his behaviour.”

Liberal capitalism was under greatest threat in the nineteenth century, when the Left espoused the concept of “wage-slavery”: the idea that when a person is compelled, under pressure of need, to rent themselves to a company and give away all control over what they do, their position is similar to that of a chattel slave.

Then obedience was powerfully contested and had to be enforced. Now, obedience is, to a large extent, voluntary and the values of liberal capitalism are internalised. We are, as the title of a recent book put it, “willing slaves”.

In the Milgram experiment, the participants quite willingly gave the victim what they believed were excruciatingly painful, possibly fatal, electric shocks. Yet what they did had no relation to what they themselves wanted to do. They voluntarily allowed someone else to dictate what they did.

As Milgram says, “It is the essence of obedience that the action carried out does not correspond to the motives of the actor but is initiated in the motive system of those higher up in the social hierarchy.”

Or as he puts it elsewhere, in hierarchical institutions, “relationship overwhelms content”. When a person merges their unique personality into an organisation, they become something else, a mere vessel, not an autonomous person.

This is remarkably similar to Noam Chomsky’s distinction between people who are moral agents (for good or ill) and “structures of power” that are basically amoral. But this distinction between people and institutions is hard to accept because, as Milgram says, society promotes the ideology that a person’s actions stem from their character. Bad outcomes are the result of bad people.

In the fifty years that have elapsed since the Milgram experiment was first conducted, the ideology of personal responsibility has increased in intensity. It has become an important foundation of the neoliberal ideology now dominant in the UK and US.

As the UK free market think tank the Institute of Economic Affairs, which played a huge part in intellectually creating Thatcherism, said as the neoliberal revolution was just beginning: "The 'economic system' has no impulse of its own apart from the personal impulses of the individuals who comprise it." The endless tail-chasing of attributing blame only serves to obscure the real problem: the absence of ethical responsibility.

“We” has become the most overused pronoun of all. We are responsible for global poverty, we cause global warming. But “we” aren’t responsible for anything, because “we” don’t exist.

But the effort to make structurally amoral institutions, such as corporations, moral is not only wrong-headed but malignant, because it takes energy and attention away from the proper task – institutional change.

As Mark Fisher asks in his book Capitalist Realism: “Does anyone really think, for instance, that things would improve if we replaced the whole managerial and banking class with a whole new set of (‘better’) people? Surely, on the contrary, it is evident that the vices are engendered by the structure, and that while the structure remains, all the vices will reproduce themselves.”

But, as Milgram says in Obedience to Authority, for a person to feel responsible for his actions, “he must sense that the behaviour has flowed from ‘the self’. In the situation we have studied, subjects have precisely the opposite view of their actions.”

So far we have only examined the behaviour of those who obeyed authority. But a sizeable minority – around one third of volunteers – defied the instruction to shock the victim and disobeyed.

As we shall see, defiance was no simple matter. But there were conditions that made disobedience easier. In Part Two we will consider these.

In Milgram’s words, “The individual is weak in his solitary opposition to authority but the group is strong … this is a lesson that every revolutionary group learns.”