
To Err is Human: Nuclear War by Mistake?*

Marianne Frankenhaeuser

Professor and Head of Psychology Division, Karolinska Institute, Stockholm. Author of 200 papers, Dr. Frankenhaeuser has been President of the European Brain and Behaviour Society, and an advisor to government, the World Health Organization, and the Institute of Medicine, National Academy of Sciences.

 


Introduction

The Evolutionary Perspective

Human Failure

Underload and Overload

Performance during Crisis

Decision Making in Groups

Concluding Comments

References

 


 

War has often broken out by mistake - a consequence of misunderstandings and misinterpretations. When misunderstandings can result in mass destruction, it is vitally important to analyze the nature of human fallibility.

 

The Evolutionary Perspective

In order to understand how difficult it can be for people to cope with the demands of the modern world, it is necessary to view human capability and human constraints from an evolutionary perspective.1 Our ancestors evolved into the present species over millions of years, when conditions for survival were entirely different from those of today. They adapted gradually to an environment which changed very slowly, and it was the slowness of the change that made adaptation possible. Then the rate of change began to increase. The history of humankind tells us that the human species spent 3 million years in the forest, 3,000 years on the fields, 300 years in the factories, and now - barely - 30 years at the computer terminal.

In striking contrast to this accelerating pace of social evolution, the human brain has remained essentially the same over thousands of years. For our ancestors, ability to adapt to heat, cold, and starvation was a prerequisite for survival. Thanks to the body's ingenious mechanisms of adaptation, our ancestors survived the hardships which were part of their everyday existence. Today's demands, while generally more psychological than physical in nature, trigger the same bodily stress responses which served our ancestors by making them "fit for fight." These bodily responses may, of course, be totally inappropriate for coping with the pressures of life today.

Thus, there is nothing in the history of humankind to prepare us for coping with the high-technology environment that we have so rapidly created for ourselves. Neither have we used the new technology to adapt environments to people's abilities and constraints. In fact, we have today a very poor fit between ancient humans and their modern environment. This poor person-environment fit induces stress and prevents people from functioning at the peak of their ability, thereby increasing the risk of performance errors and irrational decisions.

 

Human Failure

Human errors are often blamed on so-called accident-prone individuals, but there is no special category of people who commit errors. It happens to all of us, including the well-trained, the highly skilled, and the so-called stress-tolerant. All of us from time to time make mistakes, such as flashing the wrong signal, taking the wrong turn, or pushing the wrong button. Human beings are simply not foolproof: To err is human.

 

"Fallibility, lack of perfection, is the key characteristic of human behavior and is built into each system that we create."

 

The risk of committing errors increases under emotional stress, and people involved in complex defense systems are commonly exposed to emotionally arousing conditions characterized by high time pressure. Think of people faced with incidents such as nuclear false alarms, accidental nuclear explosions, or unintentional firing of missiles.2 Judgment and decision-making ability could be greatly impaired under such conditions.

In incidents of this kind, technical and human failures tend to interact. Yet such threats are generally discussed in purely technical terms, with the implication that improved technology would more or less abolish the risk. This is an illusion.

Fallibility, lack of perfection, is the key characteristic of human behavior and is built into each system that we create. Computers make mistakes. They are no more foolproof than the people who constructed them. And most importantly, computers cannot cope with the unpredicted, the unexpected. Computers cannot be programmed for events that cannot be foreseen. When something unforeseeable happens, it takes human initiative to put things right.

 

"This sudden switch from understimulation to overstimulation when something to overstimulation when something goes wrong, combined with emotional pressure, may cause temporary mental paralysis ... The consequences ... may be disastrous because of the narrow time margins."

 

But it is also very difficult for humans to cope with the unexpected, particularly under severe time pressure. And high-technology defense systems operate with ever narrower time margins. The time one has for correcting a false alarm has shrunk to a few minutes. And the more weapons we deploy, the more people will be interacting with them, and the greater will be the likelihood of disaster resulting from human error.

History is full of incidents showing how temporary indisposition or irrational behavior of people in leadership positions has caused catastrophic failures. Much less attention has been paid to the danger of temporary incapacity - due to either fatigue or overexcitement - of all the other people in the chain, who receive and transmit the information on which the leader at the top has to act. There is a considerable risk that messages will be misinterpreted and distorted before reaching the decision maker at the top of the hierarchy.

 

Underload and Overload

Brain research and behavioral research have taught us under what conditions people perform well and under what conditions performance breaks down. The inverted U-curve of Figure 1 illustrates the relation between level of stimulation and performance efficiency. There is a biological basis for this relationship. In order to function adequately, the human brain needs to be fed a moderate amount of impressions from the outside world. If the total inflow to the brain falls below a critical level, disturbances occur in brain function and mental performance deteriorates. Under the opposite condition, when the stimulus flow exceeds a certain level, brain function is likewise disturbed.3

The optimal level of human functioning is located at the midpoint of a scale ranging from sleep to overexcitation. Between these extremes the brain is moderately aroused; we are alert and perform to the best of our abilities. Mental efficiency declines as the inflow either falls below or rises above this optimal point.
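The shape of this inverted U-curve can be made concrete with a simple quadratic approximation - a minimal illustrative sketch, not a model proposed in the research cited here, with all symbols introduced solely for this illustration:

$$
E(s) = E_{\max} - k\,(s - s^{*})^{2}, \qquad k > 0
$$

where $s$ denotes the level of stimulation, $s^{*}$ the optimal level, $E_{\max}$ the peak efficiency attained at $s^{*}$, and $k$ a constant governing how steeply efficiency falls off toward understimulation on one side and overstimulation on the other.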

An early sign of understimulation is difficulty concentrating, accompanied by feelings of boredom, distress, and loss of initiative. One becomes passive and apathetic. Against this background, consider the demands put on those whose task it is to monitor processes in monotonous work situations. The brain is likely to be undernourished because nothing happens. One is not allowed to act, only to control and monitor. At the same time, the situation demands unfailing attention and preparedness to intervene.

Work demands of this kind are unavoidable in modern, complex defense systems - for example, for people isolated in silos underground and people serving aboard submarines for long periods in tedious, unchanging routines. Hence, there is a great risk that signals will be overlooked, messages misinterpreted, and information distorted. Studies show that the tendency to commit errors during monotonous monitoring increases even within the first half hour.4

Now, consider what happens when a monotonous situation suddenly becomes critical. When something goes wrong, the person on duty must switch, instantaneously, from passive, routine monitoring to active problem solving. His task then is to quickly form a picture of the alarm signals, interpret their overall message, decide which measures to take, and carry them out.

This sudden switch from understimulation to overstimulation when something goes wrong, combined with emotional pressure, may cause temporary mental paralysis. During this brief but critical time interval, the person in charge may be incapable of making use of the available information. The consequences of such a mental paralysis - however brief - may be disastrous because of the narrow time margins. And the time margins for decision making in a crisis situation are steadily shrinking as the sophistication of nuclear weapons increases and the warning times become shorter.5

 

Performance during Crisis

Let us take a brief look at what is known about factors affecting skilled performance in crisis situations.

1. Attention narrowing: When our stress level rises, we develop tunnel vision. Important dimensions of the situation may be completely blocked out from conscious awareness.
2. Perceptual distortion: Messages tend to become distorted in the direction of our expectations. Such distortions occur, in particular, when stimuli are ambiguous, when past experience influences interpretations, and when wishful fantasies color what is perceived.
3. Mental rigidity: A related psychological phenomenon is loss of mental flexibility. Coping with the unfamiliar and the unexpected becomes even more difficult in a crisis. When people are under strong emotional pressure, their cognitive processes become rigid. Their ability to take in new information is reduced, particularly information which is not consistent with established beliefs. The ability to weigh alternative courses of action is impaired, as is the capacity to reevaluate conclusions. We know from the accident at Three Mile Island that the operators adhered rigidly to a picture of the system that did not tally with the facts.
4. Vigilance fluctuation: It is also significant that the accident at Three Mile Island took place about 4:00 a.m. It is well known that mental alertness is associated with the diurnal rhythm which characterizes most physiological processes. This rhythm adapts slowly to shifts in the pattern of sleeping and waking hours. For example, when a worker changes to the night shift, his adrenaline secretion - highly important for alertness - is at the bottom of its daily rhythm during working hours. Safety is seriously threatened when an operator on the graveyard shift is out of step with his daily rhythm. He cannot be expected to function at peak level during a crisis.

In summary, errors are perfectly normal during crises because of the built-in limitations of human beings. The narrowing of attention, perceptual distortions, mental inflexibility, and vigilance fluctuations discussed above are not psychological disorders in disturbed individuals. They are the normal human responses to severe strain. They are components of how we function and are not defects which can be remedied by training.

 

Decision Making in Groups

Let us shift from the psychology of accidents to the psychology of group processes, for example, in so-called "war cabinets." Yale University social psychologist Irving Janis uses the concept "group-think" to account for a way of thinking which easily takes hold of people who are deeply involved in decision making in closed and cohesive groups.6

The group-think phenomenon is likely to develop when the stakes are high and the time pressure intense, in short, when the pressure to reach rapid consensus becomes the overriding goal. To achieve unity in a crisis situation, members of a decision-making group often abandon their own critical judgment. This group process may lead to actions and decisions which the members would never have accepted as individuals. Six characteristics can be distinguished:

1. Illusion of invulnerability: The group starts viewing itself as perfect and immune from external dangers.
2. Ignoring and rationalizing information: The group makes collective efforts to ignore information which challenges already accepted assumptions, and to rationalize away any indication that these assumptions might be wrong.
3. Moral superiority: One adopts an unquestioned belief in the group's inherent moral superiority.
4. Stereotyping: The enemy is stereotyped as either too stupid to be a threat or too evil for negotiations.
5. Illusion of unanimity: An illusion of unanimity is built, which fosters feelings of immunity from outside pressures. Thinking becomes oversimplified with a tendency to see everything in black-and-white terms.
6. Mind guards: Self-appointed "mind guards" protect the group from information that does not tally with the prevailing picture. These mind guards suppress any sign of latent disagreement among the group members.

Underlying the group-think situation is a deep uncertainty about the opponent's intentions, a basic lack of trust. Several political fiascoes of our time can be understood in terms of the group-think syndrome; for an example, see Kringlen's discussion of the Bay of Pigs in this volume.

 

Concluding Comments

Technical systems are designed on the assumption that human performance remains intact during crises. Likewise, decision-making bodies operate on the assumption that their ability to make rational decisions is maintained under conditions of crisis. Contrary to both these assumptions, psychological evidence shows that emotional stress and time urgency impair performance and endanger the rationality of decision making in both individuals and groups. These psychological facts, combined with the decreasing time margins imposed by modern weapon systems, make the risk of nuclear war by mistake a very real one.

How is it possible that human beings, with their unequaled ability to plan and to predict, to choose and to control, have placed themselves in a predicament so hazardous that perfectly normal human errors can destroy the whole globe? Part of the answer is to be found in psychological defense mechanisms. The nuclear threat is collectively denied, because to face it would force us to face some aspects of the world's situation which we do not want to recognize.

 

"What is called for now is not more pseudo adaptation. On the contrary, we need people who respond by a 'healthy maladaptation' to the nuclear threat, strong enough to cause a revolt against the present course of development."

 

By denying the threat, one achieves a state of "pseudo adaptation," which kills our tendency to rebel. Pseudo adaptation is facilitated because the nuclear threat has grown through gradual escalation, a step-by-step buildup of weapons over decades. This has led to an emotional blunting. Feelings of distress and anxiety have faded away without eliciting corrective responses.

Yet another aspect of pseudo adaptation, closely related to emotional blunting, is the decrease of emotional involvement with increased distance in time and space. People show a lack of ability to become emotionally involved in problems which are not perceived as part of the present - problems perceived as belonging to the future. One of the strategies that we use for coping with our fear of nuclear war is to push it into the "non-involving future time zone," where its emotionally arousing quality is lost. We may acknowledge the risk, but we shut our eyes to its imminence.

What is called for now is not more pseudo adaptation. On the contrary, we need people who respond by a "healthy maladaptation" to the nuclear threat, strong enough to cause a revolt against the present course of development.

The nuclear era calls for a psychological reorientation, a change in human motivation with a new emphasis on involvement in future human welfare on a worldwide basis. Instead of resorting to a very dangerous coping strategy, we must learn to cultivate the greatest human resource: people's capacity for attachment and love. Human attachment is a strong force, capable of assuming mountain-moving proportions.

The challenge now is to help people extend their attachments, their loyalties, and their engagement, to include people outside their own narrow circle, their own country, their own imminent future. This global reorientation is a prerequisite for changing the present fatal course of development.

 

 

 

 

References

* This paper is based on an invited address presented by the author at the first Congress of Psychologists for Peace, held in Helsinki, August 1986. Congress Proceedings, K. Helkama, ed. (Helsinki, 1987).

1. David A. Hamburg, "The World Transformed: Critical Issues in Contemporary Human Adaptation," Mack Lipkin Man and Nature Lectures (New York: American Museum of Natural History, 1987).

2. Lloyd J. Dumas, "Human Fallibility and Weapons," Bulletin of the Atomic Scientists, Vol. 36, No. 9 (November 1980), pp. 15-20.

3. Marianne Frankenhaeuser and Gunn Johansson, "On the Psycho-Physiological Consequences of Understimulation and Overstimulation," in L. Levi, ed., Society, Stress and Disease, Vol. IV: Working Life (London and New York: Oxford University Press, 1981), pp. 82-89.

4. Donald E. Broadbent, Decision and Stress (London and New York: Academic Press, 1971).

5. Marianne Frankenhaeuser, "To Err Is Human: Psychological and Biological Aspects of Human Functioning," in Nuclear War by Mistake: Inevitable or Preventable? Report from an International Conference in Stockholm, February 15-16, 1985.

6. Irving Janis, Victims of Groupthink (Boston: Houghton Mifflin, 1972).
