
SECTION ONE

Inevitability

Collision Course With Disaster

 

OVERVIEW

 


Institutional Collapse

Computer Error

Overlapping Errors

Instabilities in Systems without Error

Human Error

Rationality in Crisis?

Denial of Threat

Proliferation

The Cumulative Probability

 

Institutional Collapse

World War I was a disaster waiting to happen. An intricate network of interlocking alerts and mobilization plans required only a minor incident to trigger an uncontrollable political and military chain reaction. Today, the construction of fantastically complex nuclear command organizations in the US and the USSR parallels the interlocking military institutions built in the decade before 1914. These systems are sophisticated, tightly coupled, and quick-reacting, so that the effect of a small perturbation can be amplified throughout the entire nuclear force system. The US and the USSR have thus institutionalized a system with a propensity for rapid escalation toward nuclear war. ("Instabilities in the Control of Nuclear Forces," Paul Bracken)

 

Computer Error

Today's nuclear forces could not function without high-speed computers to automate the warning process, control communications, and, should it be deemed necessary, guide missiles to their targets. But computer systems can and do fail. Hardware, software, and design failures are common. Computers used in nuclear command and control are not only exceptionally complex but cannot be tested under conditions of actual use. Reasonable attempts to protect against failure by adding redundancy and backup actually add complexity on top of complexity, compounding the probability of malfunction. ("Computer System Reliability and Nuclear War," Alan Borning)

 

Overlapping Errors

With three nuclear false alarms in an average week, it is unlikely that any single false alarm will cause a nuclear war; they are too routine. But their high rate of occurrence creates a significant chance of overlapping false alarms, which can be much more dangerous. To protect against a single system failure, both the US and the USSR require independent verification of an attack by satellite and radar systems. The probability, however, that false alarms in these two systems will overlap, possibly triggering a nuclear war, is surprisingly high. ("Overlapping False Alarms: Reason for Concern?" Linn I. Sennott)
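
A rough back-of-the-envelope model suggests why overlap is worth worrying about. The numbers below (the alarm rate per system and how long an alarm stays "live") are illustrative assumptions, not figures from Sennott's chapter; the point is only that independent alarms occurring at a realistic rate overlap more often than intuition suggests.

    # Sketch: how often might a false alarm in one warning system
    # coincide with a live false alarm in the other?
    # All numbers are illustrative assumptions, not data from the chapter.

    alarms_per_week = 3          # false alarms per system (assumed)
    overlap_window_hours = 1.0   # how long an alarm stays "live" (assumed)

    hours_per_week = 7 * 24
    # Probability that, at a random moment, the other system has a live alarm
    p_live = alarms_per_week * overlap_window_hours / hours_per_week

    # Expected overlapping alarms per year, assuming the two systems
    # generate false alarms independently of each other
    alarms_per_year = alarms_per_week * 52
    expected_overlaps_per_year = alarms_per_year * p_live

    print(f"P(a given alarm overlaps one in the other system): {p_live:.4f}")
    print(f"Expected overlapping alarms per year: {expected_overlaps_per_year:.2f}")

Under these assumed numbers the model predicts a few overlapping alarms every year, which is why the chapter treats the overlap problem as far from negligible.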

 

Instabilities in Systems without Error

There is a dangerous instability in computerized defense systems even if they are working perfectly. One can assume that all the nuclear warning software works without error, and that the hardware is fail-safe. Nevertheless, the combination of two such correctly functioning systems is unstable. Because secrecy prevents either system from knowing exactly what the other is doing, any input that could be interpreted as a danger signal must be answered by an increase in readiness on the receiving side. That readiness change, in turn, is monitored by the opposing side, which then steps up its own readiness, and so on. This feedback loop triggers an escalating spiral. There is therefore the possibility of an entirely unprovoked attack triggered by the interaction of two perfectly operating computer-based systems. ("Computer War," Boris V. Raushenbakh)
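
The spiral can be illustrated with a toy model in which each side, observing the other's readiness, sets its own alert level one notch higher. The update rule, alert scale, and ceiling are purely illustrative assumptions, not anything from Raushenbakh's chapter.

    # Toy model of two error-free warning systems coupled by mutual observation.
    # Each side raises its readiness one step above the level it observes on
    # the other side. Rule and numbers are illustrative assumptions only.

    def escalate(initial_signal=1, ceiling=10):
        us, ussr = 0, initial_signal   # an ambiguous event reads as a danger signal
        cycle = 0
        while max(us, ussr) < ceiling:
            us = ussr + 1              # US responds to observed Soviet readiness
            ussr = us + 1              # USSR responds to observed US readiness
            cycle += 1
            print(f"cycle {cycle}: US alert={us}, USSR alert={ussr}")
        return cycle

    escalate()

Even though neither simulated system ever malfunctions, a single ambiguous input drives both to maximum alert within a handful of cycles; the instability lies in the coupling, not in any error.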

 

Human Error

To err is human in the best of times; in times of crisis, error becomes quite likely. The evolution of our species has not prepared us to make extreme-risk decisions in ultra-short time frames, yet this is precisely what must be done when an indication of a nuclear attack, right or wrong, is received. The brain functions poorly when understimulated, as in constant, repetitive monitoring at a missile silo or on a submarine that has been at sea for months. On the other hand, high tension, which in the event of a sudden alert can follow immediately on the heels of boredom, can produce temporary mental paralysis. Group thinking is also highly unreliable when the stakes are high and the time pressure intense. Illusions of invulnerability and moral superiority promote irrational decision making. ("To Err Is Human: Nuclear War by Mistake?" Marianne Frankenhaeuser)

 

Rationality in Crisis?

When the chips are down and the pressure is intense, groups tend to act with increasing conformity. Independent judgment is forfeited for the sake of consensus, and the role of the leader is exaggerated for the sake of loyalty. The need for speed compromises the search for objective facts. These factors operated when President Kennedy and his normally brilliant advisors decided to support the disastrous 1961 Bay of Pigs invasion of Cuba. The risk of accidental nuclear war also depends on over 100,000 people who have contact with nuclear weapons, a surprising number of whom have been found to be dependent on alcohol or drugs. Human beings, whose rational behavior is counted on to provide the final and decisive check to prevent an unintended nuclear war, are, especially in that moment of profound tension, often irrational. ("The Myth of Rationality in Situations of Crisis," Einar Kringlen)

 

Denial of Threat

Surveys of young people in the US show that a significant number fear nuclear war. In the USSR the proportion is not quite as high, but still significant. Dreams, marriage, family, and career plans can all be colored by this fear. Perhaps an even more serious danger is denial among those who do not register the threat at all. ("Young People and Nuclear War," Stanislav K. Roshchin and Tatiana S. Kabachenko)

 

Proliferation

It is not hard to learn how to make nuclear weapons, nor are they difficult to manufacture and assemble. The knowledge is widespread. The most difficult part of the process is making, or obtaining, the nuclear material. Safeguards are designed to keep such materials from spreading, being sold on the international market, being stolen, or being taken in terrorist raids. The worldwide spread of civilian nuclear power reactors, however, has produced "latent proliferation," the ability to produce nuclear weapons in short order, in over thirty countries. By the year 2000, there will be enough plutonium from such reactors for at least 500,000 nuclear weapons. The spread of such material, and the low level of security that is possible at multiple locations, substantially increases the probability that the materials will be accessible to states or individuals who do not agree to be bound by nonproliferation treaties or any other international guarantees. ("Proliferation of Nuclear Weapons," Theodore B. Taylor)

 

The Cumulative Probability

There is only a small likelihood that any one of the causes described in these chapters will trigger nuclear war. The probability can be compared to the risk of "pistol roulette," in which one chamber of a many-chambered gun is loaded, the cylinder spun, the gun put to the head, and the trigger pulled. Each time the trigger is pulled, there is only a small chance that the gun will go off. But if the trigger is pulled often enough, the probability that the gun will eventually fire approaches certainty. Whether from the escalation of interlocking war mobilization plans; from human error, or group dynamics and the lack of independent judgment in time of crisis; from computer error, or computers functioning correctly but locked in an escalating feedback loop; or from nuclear proliferation by states or by terrorists who have obtained the materials illegally: if we do not change our course, it is inevitable that nuclear weapons will eventually be used. The only way to alter this inevitability is to change the mentality that is the source of all these factors, that is, to eliminate the mentality of war. ("Nuclear War: Inevitable or Preventable?" Martin E. Hellman)
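
The roulette arithmetic is easy to make explicit: if each "trigger pull" carries an independent probability p of catastrophe, the chance of at least one catastrophe in n pulls is 1 - (1 - p)^n, which climbs toward certainty as n grows. The per-year probability used below is an illustrative assumption, not an estimate from Hellman's chapter.

    # Cumulative probability of at least one catastrophe after n independent
    # trials, each with small per-trial probability p. The value of p is an
    # illustrative assumption, not an estimate from the chapter.

    def cumulative_risk(p, n):
        return 1.0 - (1.0 - p) ** n

    p = 0.01  # assumed 1% chance per year
    for years in (10, 50, 100, 500):
        print(f"after {years} years: {cumulative_risk(p, years):.1%}")

With an assumed 1% annual risk, the cumulative probability is under 10% after a decade but exceeds 99% after five centuries, which is the sense in which the outcome becomes inevitable if the underlying conditions are never changed.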

 
