Obedience to Authority, by Stanley Milgram
Obedience to Authority. By Stanley Milgram. Harper & Row. 320 pp. $8.95.
Writing in 1963 about Eichmann's trial in Jerusalem, Hannah Arendt invoked the banality of evil. Eichmann was, it seemed to Miss Arendt, a petty bureaucrat, tragically given power over the lives of thousands of helpless people. His crimes were all the more unbearable if they really expressed no particular hostility or aggression. There had been some small comfort in believing that the Nazis were inhuman, as well as inhumane, perhaps because of something amiss in the German psyche. But to think of them as a species of mere bureaucrat was disagreeable in the extreme. People prefer to believe that the evil in the world can be blamed on monsters.
Eichmann may, after all, have been a monster, not just a banal cog in a malevolent machine. The facts of his case remain in dispute. But the dilemma of individual autonomy and responsibility versus social authority transcends Eichmann or Nazi Germany. Also in 1963, a series of experiments by Stanley Milgram, then a young psychologist at Yale, was converging on the broader issue, though from a different direction. Milgram's studies of obedience to authority grew on the empirical branch of social science, rather than the speculative, polemical one. He “ran” subjects (in analogy with rats in mazes) in controlled experiments, then performed statistical tests to check his conclusions. In Obedience to Authority, he summarizes the experiments and the findings, and then interprets them. Because the discipline of data constrains his conclusions, they deserve a credence that Arendt's do not. You may admire Arendt's genius and either believe her or not, but data are to be believed whether or not you like them.
Milgram's experiments are arresting and dramatic, among the most original in all of social psychology. It is worth knowing about them in some detail. The study recruited subjects by placing an ad in a New Haven newspaper seeking “persons . . . for a study of memory.” One hour's participation was to earn $4.00 plus carfare. The ad invited non-academic participants; it specifically excluded students and called for men between the ages of twenty and fifty. Hundreds of subjects, a genuine cross-section of ordinary people in New Haven in the early 1960's, ultimately served in one or another of the several procedures.
Respondents to the ads or to direct mailings would be contacted and an appointment made. When the subject arrived at the elegant Yale laboratory, he found two other men. One was the apparent experimenter, wearing a technician's coat and a suitably stern expression; the other was apparently another respondent to the ad, a pleasant, ordinary-looking middle-aged man. Actually, both men were play-acting in order to create a realistic simulation of social authority. The pseudo-experimenter was chosen and coached to look the part of a self-assured and knowledgeable scientist with the implicit authority of Yale behind him; the pseudo-subject was someone the real subject might easily have identified with, someone apparently recruited, like him, by curiosity plus $4.00.
The subject and the pseudo-subject drew lots to see which would be the teacher and which the learner in the experiment on memory. But the lots were rigged so that the subject always became the teacher. The pseudo-subject, the learner, was then led off, in one version of the procedure, to another room to be strapped into a chair. The pseudo-experimenter explained that the learner was to be taught a list of word associations, which were to be read by the subject. Moreover, explained the pseudo-experimenter, the learner was to be given an electric shock through electrodes whenever he made a mistake.
For the sake of plausibility, the pseudo-experimenter said that the shocks, though painful, “cause no permanent tissue damage.” But the subject was seated before an apparatus bristling with a row of 30 switches labeled “15 volts” to “450 volts” in 15-volt steps. And, to underscore the gradations, the switches were divided into seven zones going from “slight shock” to “danger: severe shock,” and then a few more switches ominously designated “XXX.” The subject was told to advance the intensity of the shock by one step each time the learner erred.
There was, of course, no shock. The learner was learning nothing. He had a predetermined schedule of answers for the learning task, generously larded with mistakes for which he would then get “shocked.” He also went through a prearranged routine in response to being shocked. For example, to 75 volts he grunted. To 120 volts he shouted that the shocks were becoming painful. To 150 volts he cried out to be released from the experiment. From 330 volts to 450 volts, he no longer made any sounds nor did he attempt to answer the word test.
From all reports, and from films of typical sessions, it is virtually certain that the subjects were thoroughly deceived. Each subject actually believed that he and the pseudo-subject had voluntarily come to an experiment on memory, and that sheer chance assigned them to the roles of teacher and learner. Each subject believed that he was inflicting ever-increasing punishment for the learner's errors. But for the draw of lots, the subject believed he would have been taking the shocks instead of dishing them out. The subjects showed signs of intense discomfort; almost all were in visible conflict. Their hands shook; they perspired; they laughed hysterically or pressed on grimly; they protested. In short, the experiment worked. Because it worked, Milgram has been strenuously, at times vituperatively, criticized. Later I will comment on the reaction to the research, but here it seems essential to say that Milgram took pains to explain the experiment and its purposes to each subject at the end of every session. An effort was made to prevent any subject from going away feeling that his response was abnormal or shameful. Most, if not all, of the subjects seemed satisfied, even appreciative. Many of them felt that the experience was worthwhile and educational. Whatever you decide about the morality of doing this kind of research, you should know that Milgram seems sensitive to the moral issue and to his responsibility to his subjects.
The main result of the study was that a large fraction of the subjects went to the top of the range of shocks, even when the learner was pleading to be released. Cries that he feared a heart attack did not spare the learner. Subjects would often express reluctance about going on, but then the authority-figure, the pseudo-experimenter, would deliver a prearranged line. He would say, "The experiment requires that you continue." To a very balky subject, he might say, "You have no other choice, you must go on." The subject could have refused the command, and some did, but often the experimenter's stern insistence would get the subject back to work; the subject would then sometimes blame the learner for his stupidity in not performing better.
The average level of obedience depended on how close, physically, the subject was to the learner. If the learner was in the next room and only his increasingly frenzied pounding on the wall could be heard, 65 per cent of the subjects reached the top of the scale of shocks without disobeying. The other 35 per cent quit, while telling the experimenter what to do with his experiment, in more or less colorful terms. Defiance was invariably emotional. If the learner sat right next to the subject, and the subject was ordered by the experimenter to help restrain the victim, 30 per cent obeyed. Intermediate degrees of remoteness from the victim produced intermediate degrees of compliance. Other wrinkles in the basic procedure showed that obedience also depended on how plausible an authority-figure the pseudo-experimenter was, or simply whether or not he was physically present.
It is all too easy now to try to minimize the value of these findings by saying that they are obvious. Milgram protected himself against that maneuver by asking various groups of people to guess the outcome of the procedure in advance. "What would you do if you were a subject?" he asked. And what would other people do? Every single person asked these questions confidently asserted that he himself would defy the experimenter, and was only slightly more realistic about the potential for evil in his fellow man. A group of psychiatrists estimated that a bit more than one person in a thousand would get to the top of the scale, the lunatic or psychopathic fringe. Where else are we to find psychiatrists off by a ratio of almost 650 to 1, for Milgram found 65 per cent obedience, not .1 per cent?
How could people, even experts, be so wrong? One cannot help wondering about the scope of our ignorance. Is it only regarding this specific subject that we live in a Pollyanna pipe dream? Perhaps so, but suppose not. Suppose that, instead, the undue optimism about ourselves is characteristic of a broader set of topics than obedience to authority. While it goes beyond anything in Milgram's book to say so, it is possible that the surprise in it is yet one more rude shock we will endure as the perfectionist dogma about human society gives way to humbler, more likely correct, expectations.
No doubt about it, these experiments were a surprise, and a nasty surprise at that. The essence of a major discovery is its capacity to cause a large shift in our beliefs about some part of the world. Milgram's data show us something about the human world that we had failed to grasp before. They show us, not precisely that people are callous, but that they can slip into a frame of mind in which their actions are not entirely their own. Psychologically, it is not they alone who are flipping the switches, but also the institutional authority—the austere scientist in the laboratory coat. The authority is taken to have the right to do what he is doing, by virtue of knowledge or status. Permutations of the basic procedure made it clear that the subjects' obedience depended on a sense of passivity, and that disobedience resulted if the subject was made to feel as if he were acting on his own initiative. Ordinary people will, in fact, not easily engage in brutality on their own. But they will apparently do so if someone else is in charge.
The experiments prove decisively that ordinary people can turn into lethal instruments in the hands of an unscrupulous authority. The subjects who obeyed did not appear to be in any way atypical; they were not stupid, maladjusted, psychopathic, or immoral in usual terms. They simply did not apply the usual standards of humanity to their own conduct. Or, rather, the usual standards gave way to a more pressing imperative, the command of authority. The brutality latent in these ordinary people, in all of us, may have little to do with aggression or hostility. In fact, hostility was here the humane impulse, for when it turned on the pseudo-experimenter it was a source of disobedience. In Milgram's procedure, and in the many natural settings it more or less mimics, brutality is the awful corollary of things we rightly prize in human society, like good citizenship or personal loyalty.
Milgram's work is said by some to show how close our society has come to Nazi Germany. But does it really? In Italy, Australia, South Africa, and contemporary Germany, subjects in comparable experiments have been more, not less, obedient. No experiment anyplace has yet produced a negative result to boast about. In the totalitarian countries, from Spain to China, experiments like Milgram's have not been done, for they would be considered subversive, as they would indeed be. But just picture how people would behave in Spain or Albania or China, where obedience is taken far more seriously than in permissive, turbulent America. Ironically, we live in a society where disobedience, not obedience, is in vogue, contrary to the fashionable rhetoric of journalists and social commentators. Still, Milgram's American subjects mostly obeyed, and would probably do so even today, ten tumultuous years later.
The parallels to Nazi Germany, then, really say something about the quality of the authority rather than the obedience to it. A degree of obedience is a given in human society, and there is enough of it to turn dangerous if the wrong people wield it. The political problem is how to decide who shall be the authority, for it is futile if not dangerous to hope for a society of disobeyers. Consider a contemporary case in point. Federal Judge Gesell recently accused some lawyers of a "Nuremberg mentality." They had defended their client, Egil Krogh, on the grounds that he was obeying the orders of the President, his Commander-in-Chief, when he lied under oath. The judge's view was that Krogh may indeed have been obeying orders, but he should have disobeyed. At other times, people honor their loyal and obedient citizens instead of imprisoning them, and I suspect that Judge Gesell is no different. The Judge saw it the way he did because a bitter alienation of many people from our government has fostered the illusion that obedience to authority is itself malevolent.
The illusion is palpably false, though the authority may, alas, be malevolent. There is a crucial dilemma here, one that will plague any political scheme that values both social order and individual autonomy. But the horns of the dilemma have never been so clear as they are in the light of Milgram's experiments. On one side, we find that even permissive, individualistic America creates people who can become agents of terror. As the weapons of terror become more powerful and more remote from their victims, the dangers of obedience grow. We know that bombardiers in military aircraft suffer little of the conflict and anxiety shown by Milgram's subjects, for they inflict punishment at an even greater distance and they serve an authority with greater license. That horn of the dilemma is much in the news these days. But the other horn is the penalty if we set too high a value on individual conscience and autonomy. The alternative to authority and obedience is anarchy, and history teaches that that is the surest way to chaos and, ultimately, tyranny.
Though he recognizes the alternatives, Milgram's sympathies are libertarian. He wants a more defiant citizenry, a higher percentage of disobeyers to authority. I have no doubt that it would be easy to make people more likely to say no to authority, simply by reducing the penalties for doing so. But the evidence does not suggest that people use only benevolence or moral sensitivity as the criteria for rejecting authority. Think of some real examples. Would it be greed or a higher virtue that would be the main reason for defaulting on taxes if the penalties were reduced? What deserter from the army would fail to claim it was conscience, not cowardice, once conscientious desertion became permissible? Milgram, and no doubt others, would probably answer that reducing the penalties is not enough—that people need to be taught virtue, not just relieved of the hazards of vice. That is fine, but it does not seem like cynicism to insist that the burden of proof falls on those who think they know a way to make people better than they have ever been. I find no proof in this book, or in the contemporary literature of civil disobedience. Milgram's work, brilliant as it is, resolves no dilemmas.
Psychology does not often spawn a finding that is neither trivial, obvious, nor false. Milgram's is the rare exception. The research is well conceived and done with care and skill, even elegance. It was both unexpected and timely, which are virtues that add up to far more than the sum of the parts. Why, then, has the work produced such polar extremes of response? It won Milgram professional recognition and numerous honors, and it was also attacked again and again in the technical literature. The book was reviewed on the front page of the New York Times Book Review, an uncommon distinction for an academic work in social science, but the review was a hatchet job by a professor of literature whose distaste for social science was the main message.
Many people, besides the Times reviewer, do not like social science. There are so many of them that I can even sort them into categories: those who dislike it because they believe it tells us nothing they did not know and those who dislike it because it tells us something they did not want to hear. Milgram's work arouses those in the latter category, who typically insist that they belong in the former category. It is one thing to contemplate the banality of evil in the abstract, but something else to learn that the spore will germinate in New Haven at the prompting of a man in a laboratory coat. The gross discrepancy between what people predicted for the experiment and what others did as subjects is the tangible proof of the findings' power to inform us about ourselves—about our capacity for cruelty and our ignorance of the capacity. Those who continue to insist that the experiment teaches nothing may be relying on ignorance to solve the awful dilemma of authority. It will resolve nothing, of course, but it is no surprise that Milgram's news has driven some heads into the sand.
But that is not the only problem with Milgram's work. Some people, often social scientists themselves, object to the element of deception, especially when it is calculated to produce acute discomfort. This seems to me a valid concern, a secondary dilemma arising from the fact of the research itself rather than from its findings. To learn how people behave under duress or danger, the researcher dissembles, for he cannot subject people to real-life hazards. If there is to be experimentation on people in social settings, there is therefore likely to be deception and manipulation. It is an unpleasant prospect, and easy to reject. But, then, consider Milgram's experiments. Deception and manipulation led to a remarkable addition to our knowledge of the perils of authority. Knowledge like that comes hard and slow. Can we afford to prohibit further discoveries of that caliber and relevance?
Some people answer the question with a dogmatic yes, setting the highest priority on individual privacy at the risk of continuing ignorance. That happens not to be my view. I value privacy but worry about ignorance. A small, temporary loss of a few people's comfort and privacy seems a bearable price for a large reduction in ignorance, but I can see, as can Milgram, how delicate a judgment this implies. Even so, I hope there are other experiments like Milgram's coming along—experiments that will teach us about ourselves, with no more than the minimum necessary deception and discomfort, elegantly and economically conducted. It should not be easy to do experiments like Milgram's—for they should not be done casually—but it should be possible, and, needless to say, the experimenter should not be held in contempt if the outcome is unexpected or uncomfortable. The goal of science is news, not good news.