Akira Kawasaki | Bulletin of the Atomic Scientists
On August 9, 2011—66 years after the atomic bombing of Nagasaki but only about five months after the disaster at the Fukushima Daiichi nuclear power plant—Nagasaki Mayor Tomihisa Taue observed that, until Fukushima, many people had believed in the myth of safety at nuclear power plants. “But what about the more than 20,000 nuclear weapons in the world?” asked Taue. “Do we still believe that the world is safer thanks to nuclear deterrence? Do we still take it for granted that no nuclear weapons will ever be used again?”
Four years later, the Fukushima disaster remains in the news—more than 100,000 evacuees are unable to return home and the site still isn’t under control. Meanwhile, seven decades have passed since the Hiroshima and Nagasaki bombings and the world’s attention has shifted away from nuclear weapons. But the risks these weapons pose have not disappeared.
Human error, technical failure. In 2012, a commission mandated by Japan’s Diet to investigate the Fukushima disaster reported that the accident had been a “profoundly man-made disaster.” The commission concluded that both the plant’s operator and the national regulator had been aware that the Fukushima facility required structural reinforcement—but chose not to tackle the problem. Indeed, Japan’s government, industrial sector, and nuclear experts maintain a collusive relationship that has earned them a derisive nickname: the “nuclear village.” The International Atomic Energy Agency, in its May 2015 final report on Fukushima, criticized the plant’s operator for paying insufficient attention to “low probability, high consequence” events. This failure partly stemmed, the agency reported, from “the basic assumption in Japan, reinforced over many decades, that the robustness of the technical design of the nuclear plants would provide sufficient protection against postulated risks.”
Are nuclear weapons exempt from such blundering? No—there is every reason to believe that individuals responsible for nuclear weapon safety will exhibit, like other human beings, a reluctance to deal with difficult challenges, a tendency to turn away from inconvenient truths, and a simple capacity for error. US investigative journalist Eric Schlosser, in his 2013 book Command and Control, reported on the many serious accidents involving nuclear weapons, or “broken arrow” incidents, that have afflicted the US nuclear complex over the decades. Arguing that there can be no definitive way to ensure that nuclear weapons are completely safe and secure, Schlosser has called them the world’s “deadliest, most dangerous machines.”
Even if one assumes that nuclear weapons will never again be deliberately used in wartime—and such an assumption may be wishful thinking, given the number of unpredictable conflicts around the world today—grave concern must still surround the notion that individuals who exercise command and control of nuclear weapons will never commit catastrophic errors leading to a detonation. (Kyodo News reported this year that US missileers based in Okinawa, in the final phase of the Cuban Missile Crisis, had received a nuclear launch order—an order issued in error.) Nor can one assume that technical systems will never fail, whether from prosaic causes such as aging or less likely ones such as cyber attacks.
To be sure, the probability of an accidental nuclear detonation is low. But the probability of an accident at Fukushima was supposed to be low as well. When the accident happened, the consequences were catastrophic.