Time to nuclear Armageddon
- This article is the narrative basis for the accompanying video of a presentation at the Joint Statistical Meetings, 2019-08-01. It is on Wikiversity to invite further discussion, expansion, correction, and revision of the narrative presented here, subject to the standard Wikimedia rules of writing from a neutral point of view while citing credible sources.
This work was inspired by Daniel Ellsberg's 2017 book, The Doomsday Machine. In this book Ellsberg says that as long as the world maintains large nuclear arsenals, it is only a matter of time before there is a nuclear war, which he claims will almost certainly lead to a nuclear winter that lasts over a decade, during which 98 percent of humanity will starve to death if they do not die of something else sooner.
Ellsberg's claims suggest statistical questions regarding the probability distribution of the time to a nuclear war and the severity of the consequences.
The following outlines a methodology for addressing these statistical questions, reviews relevant literature, mentions other leading figures supporting Ellsberg's claims, and notes that nuclear proliferation is continuing, before outlining future work.
We suggest here the following methodology:
- 1. Select a list of incidents.
- 2. Model the time between such incidents.
- 3. Estimate subjective probabilities for (a) an essentially equivalent repetition of the same incident leading to a nuclear war, and (b) the distribution of the severity of the consequences of that war; and
- 4. Combine steps 2 and 3 into compelling communications.
Someone attacked item 3, saying, “You, Spencer Graves, are willing to speculate. That's just a rank speculation. I am not willing to speculate.”
My response is that an unwillingness to speculate is essentially equivalent to saying that the probability is zero, and I think that is an unrealistic speculation.
A prototype use of this methodology considers only two incidents:
John F. Kennedy, the US President during the 1962 Cuban Missile Crisis, said that there was a probability of between a third and a half that that incident would have gone to a nuclear war. He died before learning that Soviet nuclear weapons were already in Cuba at that time. The crisis ended less than 48 hours before a planned US invasion, predicated on the belief that there were no such weapons in Cuba. At a 30th-anniversary conference in 1992, Fidel Castro (Cuban head of state in 1962) told Robert McNamara (US Secretary of Defense in 1962) that if the US had invaded, those nuclear weapons would have been used, even though Castro knew that not one person in Cuba would survive.
The 1983 Soviet nuclear false alarm incident occurred while US President Ronald Reagan was building up the US military and challenging the Soviets. Yuri Andropov, the Soviet General Secretary, and his inner circle believed that the US was preparing a nuclear first strike.
This gives us one observation of t₁ = 21 years, the time between the 1962 Cuban Missile Crisis and the 1983 Soviet nuclear false alarm incident. In addition, the time to the next incident of a similar magnitude is censored at t₂ = 36 years, the time between the 1983 Soviet nuclear false alarm incident and 2019, as this is being written. Standard statistical theory says that the likelihood for these two observations is the product of the density at t₁ and the survival function at t₂:

L(θ) = f(t₁ | θ) · S(t₂ | θ).
It seems reasonable to assume, at least for an initial demonstration of this methodology, an exponential distribution with mean μ. This means the likelihood is as follows:

L(μ) = (1/μ) e^(−t₁/μ) · e^(−t₂/μ) = (1/μ) e^(−(t₁ + t₂)/μ).
To the extent that this is accurate, it says that the maximum likelihood estimate of the mean time to the next comparable nuclear crisis is the total observation time divided by the number of events:

μ̂ = (21 + 36) / 1 = 57 years.
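As a numerical check on this arithmetic, the censored-exponential log-likelihood can be evaluated directly. The following is an illustrative Python sketch (the function and variable names are ours, and the article's own computations would more naturally be done in R):

```python
import math

def neg_log_lik(mu, t_observed=21.0, t_censored=36.0):
    # -log of (density at t_observed) * (survival function at t_censored)
    # for an exponential distribution with mean mu:
    # -log[(1/mu) exp(-t_obs/mu) * exp(-t_cens/mu)]
    return math.log(mu) + (t_observed + t_censored) / mu

# Analytic MLE: total observation time divided by the number of events
mu_hat = (21.0 + 36.0) / 1
print(mu_hat)  # 57.0

# Sanity check: the MLE beats nearby candidate values
assert all(neg_log_lik(mu_hat) <= neg_log_lik(m) for m in (40.0, 50.0, 60.0, 70.0))
```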
We can get an equivalent answer by exploiting the well-known duality between exponential and Poisson distributions, considering this history as one Poisson-distributed observation of the number of such incidents in each of the 57 years between 1962 and this writing in 2019: one such incident in 1983 and 0 in each of the other 56 years. The likelihood for this formulation is as follows:

L(λ) = (λ e^(−λ)) · (e^(−λ))⁵⁶ = λ e^(−57λ).
This is maximized at λ̂ = 1/57 ≈ 0.018 such incidents per year.
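The equivalence between the two formulations can also be checked numerically. A minimal Python sketch (names are ours, not from the article):

```python
import math

def neg_log_lik(lam, n_years=57, n_incidents=1):
    # -log likelihood of one year with an incident and 56 years with none,
    # each year an independent Poisson(lam) count (constant terms dropped):
    # -log[lam * exp(-57 * lam)] = 57*lam - log(lam)
    return n_years * lam - n_incidents * math.log(lam)

lam_hat = 1 / 57  # about 0.018 incidents per year
assert neg_log_lik(lam_hat) <= min(neg_log_lik(lam_hat * 0.9),
                                   neg_log_lik(lam_hat * 1.1))
```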
The Poisson formulation is useful because the bssm package for R can model a normal random walk in log(Poisson mean). This will not be pursued here but could be useful in future work, either with a larger list of incidents or with nuclear proliferation, discussed below.
Simon Beard shared the following literature review of studies estimating something like the probability of a nuclear war in the next year, which he compiled jointly with Tom Rowe of Virginia Tech and James Fox of the University of Oxford. Beard's analysis is augmented here with the probability of a nuclear war in the 70 years between the first test of a nuclear weapon by the Soviet Union (now Russia) in 1949 and the time that this is being written in 2019. This uses the fact that if there is a constant probability p of a nuclear war in a given year, the probability of at least one nuclear war in 70 years is 1 − (1 − p)⁷⁰. The upper limit of 7% for the probability of a nuclear war in the next year (Barrett et al., 2013) is clearly not plausible as a constant annual probability over that period: if it were, the probability that we would already have had a nuclear war would be 99%.
| Source | Probability of a nuclear war, annualized | Probability of a nuclear war in 70 years |
| --- | --- | --- |
| Barrett et al. (2013) | 0.0001% to 7% | 0.007% to 99% |
| Project for the Study of the 21st Century (2015) | 0.3% | 18% |
| Good Judgment Project (2018) | 0.7% | 40% |
| Pamlin and Armstrong (2015) | 0.1% | 7% |
| Sandberg and Bostrom (2008) | 0.4% | 25% |
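The annualized-to-70-year conversion used in this table can be reproduced in a few lines. An illustrative Python sketch (the function name is ours):

```python
def p_in_70_years(annual_p, years=70):
    # With a constant annual probability p, the probability of at least one
    # nuclear war in n years is 1 - (1 - p)**n
    return 1 - (1 - annual_p) ** years

print(round(p_in_70_years(0.007), 2))  # 0.39, close to the 40% shown for the Good Judgment Project
assert p_in_70_years(0.07) > 0.99      # the implausible 7% upper limit from Barrett et al. (2013)
```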
It seems useful to highlight the Good Judgment Project (2018), because it uses a methodology developed over roughly 20 years of research funded in part by the Intelligence Advanced Research Projects Activity (IARPA) and documented in Tetlock and Gardner (2015). That methodology produced forecasts roughly 30% better than those of intelligence analysts with access to classified information. It is as follows:
- Recruit volunteers and ask them a series of forecasting questions, such as estimating the probability of a certain event within a specific time period (typically 1, 2, or 3 years).
- Identify the volunteers with the best forecasts.
- Organize them in teams.
- Study what the best teams did.
This methodology might potentially be crowdsourced on platforms such as Wikidata, Wikiversity, and Wikipedia.
Other leading figures supporting Ellsberg's claims
Ellsberg is not alone in his concern about this. Robert McNamara also said that as long as the world has large nuclear arsenals, it is only a question of time before there is a nuclear war. Similar concerns led former US Senator Sam Nunn and media executive Ted Turner to found the Nuclear Threat Initiative, also supported by former US Secretary of Defense William J. Perry and former US Secretaries of State Henry Kissinger and George Shultz.
Atmospheric scientists Owen Toon, Alan Robock et al. (2017) have estimated that a relatively minor nuclear war between India and Pakistan could involve at least 100 nuclear weapons, leading to a nuclear autumn during which two billion people not involved in the nuclear exchange would starve to death.
A hundred nuclear weapons is only about 2 percent of the US nuclear arsenal. A nuclear war involving the US would likely be closer to Ellsberg's doomsday scenario than the two billion dead mentioned by Toon, Robock et al. (2017).
The fact that nuclear proliferation is continuing suggests that any model that assumes that the risk of a nuclear war is constant or declining is probably wrong. When the Nuclear Non-Proliferation Treaty took effect in 1970, there were 5 nuclear-weapon states. When US President George W. Bush denounced an “axis of evil” consisting of North Korea, Iran, and Iraq on 2002-01-28, there were 8. As this is being written in 2019, there are 9. As long as nuclear-weapon states continue to threaten countries without them, the pressure for nuclear proliferation will continue, and the risk of a nuclear war will likely grow.
| Year | Number of nuclear-weapon states | Event |
| --- | --- | --- |
| 1970 | 5 | Nuclear Non-Proliferation Treaty takes effect |
| 2002 | 8 | “Axis of evil” speech by US President Bush condemning North Korea, Iran, and Iraq |
| 2006 | 9 | First test of a nuclear weapon by North Korea |
It should be relatively easy to use the bssm package for R to model a random walk in log(Poisson mean) for the number of first tests by new nuclear-weapon states each year.
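To make the idea concrete, here is a sketch of the data-generating process such a model assumes: a normal random walk on log(Poisson mean) driving yearly counts of first tests. This is a Python simulation for illustration only; all numerical values (initial rate, step size) are assumptions we chose, not fitted estimates, and an actual analysis would fit the model with bssm in R.

```python
import math
import random

random.seed(2019)

log_mu = math.log(0.1)  # assumed initial rate: ~0.1 first tests per year
sigma = 0.2             # assumed standard deviation of the random-walk step

counts = []
for year in range(50):
    log_mu += random.gauss(0.0, sigma)  # random walk on log(Poisson mean)
    mu = math.exp(log_mu)
    # Draw a Poisson(mu) count by inversion (the stdlib has no Poisson sampler)
    u, term, cum, k = random.random(), math.exp(-mu), math.exp(-mu), 0
    while u > cum:
        k += 1
        term *= mu / k
        cum += term
    counts.append(k)
```

Fitting the model then amounts to inferring sigma and the latent log-mean path from the observed counts, which is exactly what bssm's state-space machinery does.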
Beyond this, it could be useful to try to crowdsource assessments for a larger list of incidents threatening nuclear war using Wikimedia Foundation projects, especially Wikipedia, Wikiversity, and Wikidata.
Stanford Engineering Professor Emeritus Martin Hellman has estimated that the probability is at least 10 percent that a child born today would die prematurely from a nuclear war. It would be useful to write an R function to convert probability distributions generated by these kinds of models into estimates of the probability that a person of any age, especially a child born today, would die prematurely from a nuclear war.
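A sketch of what such a function might look like, in Python for convenience. All default parameter values below are placeholders we invented for illustration; Hellman's 10% figure rests on his own risk analysis, not on these numbers.

```python
import math

def p_premature_death_from_nuclear_war(years_remaining,
                                       crises_per_year=1 / 57,
                                       p_war_given_crisis=0.3,
                                       p_death_given_war=0.5):
    # Treat major crises as a Poisson process; each crisis escalates to war
    # with p_war_given_crisis, and a war kills a given individual with
    # p_death_given_war, so lethal events arrive at the thinned rate below.
    lethal_rate = crises_per_year * p_war_given_crisis * p_death_given_war
    return 1.0 - math.exp(-lethal_rate * years_remaining)

# e.g., a child born today with ~80 years of remaining life expectancy
print(round(p_premature_death_from_nuclear_war(80.0), 2))  # 0.19 under these assumed inputs
```

A real version would replace the constant crisis rate with the posterior distribution from a fitted model and integrate over it.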
See also Time to extinction of civilization, which provides more detail behind part of the present discussion.
- Anthony M. Barrett; Seth Baum; Kelly Hostetler (May 2013), "Analyzing and Reducing the Risks of Inadvertent Nuclear War Between the United States and Russia", Science and Global Security, 21 (2), Wikidata Q66151664
- Good Judgment Project (2018), nonpublic data, cited in comment 2018-08-06 by Carl Shulman to Rhys Lindmark (6 August 2018), "Current Estimates for Likelihood of X-Risk?", Effective Altruism Forum, Wikidata Q66730943.
- Martin Edward Hellman (May 2008), "Risk Analysis of Nuclear Deterrence", The Bent of Tau Beta Pi, Wikidata Q66156663
- Carl Lundgren (27 June 2013), "What are the odds? Assessing the probability of a nuclear war", Nonproliferation Review, 20 (2): 361–374, Wikidata Q66149076
- Experts see rising risk of nuclear war: survey, Project for the Study of the 21st Century, 12 November 2015, Wikidata Q66179978
- Dennis Pamlin; Stuart Armstrong (2015), Global Challenges: 12 Risks that Threaten Human Civilisation, Global Challenges Foundation, Wikidata Q66202646
- Anders Sandberg; Nick Bostrom (2008), Global Catastrophic Risks Survey, Wikidata Q66210566
- Philip E. Tetlock; Dan Gardner (2015), Superforecasting: The art and science of prediction, Crown Publishing Group, Wikidata Q21203378
- Owen B. Toon; Alan Robock; Michael Mills; Lili Xia (May 2017), "Asia Treads the Nuclear Path, Unaware That Self-Assured Destruction Would Result from Nuclear War", The Journal of Asian Studies, 76 (02): 437–456, doi:10.1017/S0021911817000080, Wikidata Q58262021
- Alexei V. Turchin (8 August 2010), Structure of the global catastrophe. Risks of human extinction in the XXI century, Lulu.com, Wikidata Q66181092
- Ellsberg, Daniel; Goodman, Amy; González, Juan (2017-12-06), Daniel Ellsberg Reveals He was a Nuclear War Planner, Warns of Nuclear Winter & Global Starvation, Democracy Now, retrieved 2017-12-06.
- A version of this article is scheduled to appear in Spencer Graves (2019), "Time to nuclear Armageddon", JSM proceedings, Wikidata Q66918248. More on this appears in Time to extinction of civilization.
- Daniel Ellsberg (2017), The Doomsday Machine: Confessions of a Nuclear War Planner, Bloomsbury Publishing, Wikidata Q63862699, p. 206.
- Robert McNamara; James G. Blight (2003), Wilson's ghost: reducing the risk of conflict, killing, and catastrophe in the 21st century, PublicAffairs, Wikidata Q64736611, pp. 189-190.
- Simon Beard, Wikidata Q64708568
- Cited from a private communication from Simon Beard. The numbers here correct minor errors in the corresponding slide in the accompanying video, commons:File:Graves-JSM2019-08-01.webm
- p. 16/212: “The likelihood of a full-scale nuclear war between the USA and Russia has probably decreased. Still, the potential for deliberate or accidental nuclear conflict has not been removed, with some estimates putting the risk in the next century or so at around 10%”. This makes the risk in 1 year 1 − 0.9^(1/100) = 0.001053, and the risk in 70 years 0.071, ignoring their comment that “The likelihood ... has probably decreased” and ignoring the chances of a nuclear war involving other nuclear-weapon dyads. Later, they write, “Based on available assessments the best current estimate for nuclear war within the next 100 years is 5% for infinite threshold [and] 0.005% for infinite impact” (p. 148).
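The arithmetic in this note can be checked directly (Python, for convenience):

```python
# A constant annual risk implying a 10% chance of nuclear war per century
annual = 1 - 0.9 ** (1 / 100)
print(round(annual, 6))  # 0.001053

# The corresponding risk over 70 years
in_70_years = 1 - (1 - annual) ** 70
print(round(in_70_years, 3))  # 0.071
```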
- p. 1 (p. 2 of 6 in the pdf): a 30% chance that the “total killed in all nuclear wars” between 2008 and 2100 would be “at least 1 million dead”.
- Robert McNamara; James G. Blight (2003), Wilson's ghost: reducing the risk of conflict, killing, and catastrophe in the 21st century, PublicAffairs, Wikidata Q64736611
- Christine Blackman (17 July 2009), "Chance of nuclear war is greater than you think: Stanford engineer makes risk analysis", Stanford News, Wikidata Q66424609
- cited from Simon Beard (15 September 2017), "Less Hollywood, More Car Crash", Centre for Research in the Arts, Social Sciences and Humanities, University of Cambridge, Wikidata Q66147141