Time to nuclear Armageddon
- This article is the narrative basis for the accompanying video of a presentation at the Joint Statistical Meetings, 2019-08-01. It is on Wikiversity to invite further discussion, expansion, correction, and revision of the narrative presented here, subject to the standard Wikimedia rules of writing from a neutral point of view and citing credible sources.
This work was inspired by Daniel Ellsberg's 2017 book, The Doomsday Machine. In this book Ellsberg says that as long as the world maintains large nuclear arsenals, it is only a matter of time before there is a nuclear war, which he claims will almost certainly lead to a nuclear winter that lasts over a decade, during which 98 percent of humanity will starve to death if they do not die of something else sooner.^{[1]}^{[2]}
Ellsberg's claims suggest statistical questions regarding the probability distribution of the time to a nuclear war and the severity of the consequences.
The following outlines a methodology for addressing these statistical questions. Previous estimates of the probability of a nuclear war in the next year range from 1 chance in a million to 7 percent, with 0.7 percent offered by the Good Judgment Project, which arguably uses the best-known methodology for making such estimates. If that rate is assumed to have been constant over the 70 years since the first test of a nuclear weapon by the Soviet Union in 1949, these estimates translate into probabilities of a nuclear war in 70 years ranging from 70 chances in a million to 99 percent. The Good Judgment answer translates into a 40 percent chance of such a war in 70 years, past or future, or equivalently 20 chances in a million that the next 24 hours will see the start of a crisis leading to a nuclear war.
Moreover, nuclear proliferation is continuing. This suggests that the probability of a nuclear war and winter is likely increasing and will continue to increase until something happens to make it effectively impossible for anyone to make more nuclear weapons for a very long time. Two possible scenarios might produce such a nuclear disarmament:
- A nuclear war and winter ending civilization.
- An unprecedented international movement that strengthens international law to the point that the poor and disfranchised have effective nonviolent means for pursuing a redress of grievances.
This article ends with an outline of possible future research in this area.
Methodology[edit]
We suggest here the following methodology:
- 1. Select a list of incidents.
- 2. Model the time between such incidents.
- 3. Estimate subjective probabilities for (a) an essentially equivalent repetition of the same incident leading to a nuclear war, and (b) the distribution of the severity of the consequences of that war.
- 4. Combine items 2 and 3 into compelling communications.
Someone attacked item 3, saying, “You, Spencer Graves, are willing to speculate. That's just a rank speculation. I am not willing to speculate.”
My response: an unwillingness to speculate is essentially equivalent to saying that the probability is zero, and that is itself an unrealistic speculation.
A prototype use of this methodology considers only two incidents:
- (1) The 1962 Cuban Missile Crisis, and
- (2) The 1983 Soviet nuclear false alarm incident.
President Kennedy, the US President during the Cuban Missile Crisis, said that there was a probability of between a third and a half that the incident would end in a nuclear war. He died before learning that Soviet tactical nuclear weapons were already in Cuba at that time. The crisis ended less than 48 hours before a planned US invasion, predicated on the belief that there were no such weapons in Cuba.^{[3]} At a 30th-anniversary conference in 1992, Fidel Castro (Cuban head of state in 1962) told Robert McNamara (US Secretary of Defense in 1962) that if the US had invaded, those nuclear weapons would have been used, even though Castro knew that not one person in Cuba would survive.^{[4]}
The 1983 Soviet nuclear false alarm incident occurred when US President Ronald Reagan was building up the US military and challenging the Soviets. Yuri Andropov, the Soviet leader, and his inner circle believed that the US was preparing a nuclear first strike.
This gives us one observation of $t_1 = 21$ years for the time between the 1962 Cuban Missile Crisis and the 1983 Soviet nuclear false alarm incident. In addition, the time to the next incident of a similar magnitude is censored at the $t_2 = 36$ years between the 1983 incident and 2019, as this is being written. Standard statistical theory says that the likelihood for these two observations is the product of the density at $t_1$ and the survival function at $t_2$:
- $L = f(t_1)\,S(t_2)$.
It seems reasonable to assume, at least for an initial demonstration of this methodology, an exponential distribution with mean $\lambda$. This means the likelihood is as follows:
- $L(\lambda \mid t_1, t_2) = \lambda^{-1} e^{-t_1/\lambda}\, e^{-t_2/\lambda} = \lambda^{-1} e^{-(t_1 + t_2)/\lambda}$.
To the extent that this is accurate, it says that the maximum likelihood estimate of the mean time to the next comparable nuclear crisis is (21 + 36) divided by the one observed event:
- $\hat\lambda = (t_1 + t_2)/1 = (21 + 36)/1 = 57$ years.
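As a check on the closed-form result, the censored-data likelihood can also be maximized numerically. This is a minimal sketch, not part of the original analysis; the article names R, but this stdlib-only Python version keeps the example self-contained, with a grid search standing in for a proper optimizer:

```python
import math

t1 = 21.0  # observed gap: 1962 Cuban Missile Crisis to 1983 false alarm
t2 = 36.0  # censored gap: 1983 to 2019, no comparable incident yet

def log_likelihood(lam):
    """Exponential(mean lam): log of [density at t1] times [survival at t2]."""
    return -math.log(lam) - t1 / lam - t2 / lam

# Grid search over candidate mean times, 1.0 to 499.9 years in steps of 0.1
best = max((lam / 10 for lam in range(10, 5000)), key=log_likelihood)
print(round(best))  # -> 57, matching the closed-form MLE
```

Setting the derivative of the log-likelihood to zero gives the same answer analytically: $-1/\lambda + (t_1 + t_2)/\lambda^2 = 0$ at $\lambda = t_1 + t_2 = 57$.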
We can get an equivalent answer by exploiting the well-known duality between exponential and Poisson distributions, considering this history as consisting of one Poisson-distributed observation of the number of such incidents in each of the 57 years between 1962 and this writing in 2019: one such incident in 1983 and 0 in each of the other 56 years. The likelihood for this formulation is as follows:
- $L(\mu \mid x_1, \dots, x_{57}) = \prod_{i=1}^{57} \frac{\mu^{x_i} e^{-\mu}}{x_i!} = \mu\, e^{-57\mu}$.
This is maximized with $\hat\mu = 1/57 \approx 0.018$ such incidents per year.
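The duality can be verified numerically: the Poisson likelihood, with one incident in one of 57 years and zero in the rest, peaks at the reciprocal of the exponential answer. A minimal Python sketch (illustrative only):

```python
import math

n_years = 57     # the 57 annual observations, 1962 to this writing
n_incidents = 1  # the single 1983 incident

def log_likelihood(mu):
    """Log of the Poisson likelihood mu * exp(-57 * mu), dropping constants."""
    return n_incidents * math.log(mu) - n_years * mu

# Grid search over candidate rates (incidents per year)
rates = [k / 10000 for k in range(1, 2000)]
best = max(rates, key=log_likelihood)
print(best)  # close to 1/57, i.e., a mean time of about 57 years between incidents
```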
The Poisson formulation is useful, because it makes it easier to consider a non-constant hazard. The glm function in the R (programming language) can easily model a linear relationship between the log(Poisson mean) and the time since the very first test of a nuclear weapon by the United States in 1945. Moreover, the bssm package for R can model a normal random walk of the log(Poisson mean). These options will not be pursued here but might be useful in future work, either with a larger list of incidents or with nuclear proliferation, discussed below.
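The log-linear trend model named above can be sketched in stdlib-only Python for illustration; R's `glm(count ~ time, family = poisson)` would fit it properly. The model form here is an assumption (log mean linear in years since 1945 would be analogous; 1949 is used below only to match the article's 70-year anchor), and the coarse grid search is a stand-in for real iteratively reweighted least squares:

```python
import math

# Annual incident counts, 1962 through 2018: one incident (1983), zero otherwise
years = range(1962, 2019)
counts = {y: 1 if y == 1983 else 0 for y in years}

def log_likelihood(a, b):
    """Poisson log-likelihood with log-linear trend: log(mu_t) = a + b*(t - 1949)."""
    ll = 0.0
    for y in years:
        mu = math.exp(a + b * (y - 1949))
        ll += counts[y] * math.log(mu) - mu
    return ll

# Coarse grid over intercept a and slope b (per-year change in log mean)
grid = [(a / 10, b / 100) for a in range(-100, 1) for b in range(-20, 21)]
a_hat, b_hat = max(grid, key=lambda p: log_likelihood(*p))
print(a_hat, b_hat)
```

With only one incident, and that incident falling before the midpoint of the observation window, the fitted slope comes out slightly negative; a single event cannot meaningfully constrain a trend, which is why a longer incident list would be needed before such a model is informative.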
Relevant literature[edit]
Simon Beard^{[5]} shared the following literature review of studies estimating something like the probability of a nuclear war in the next year, which he compiled jointly with Tom Rowe of Virginia Tech and James Fox of the University of Oxford.^{[6]} Beard's analysis is augmented here with the probability of a nuclear war in the 70 years between the first test of a nuclear weapon by the Soviet Union (now Russia) in 1949 and the time that this is being written in 2019. It is augmented also by columns translating the annual probabilities into the number of chances in a million (parts per million, ppm) that a crisis leading to a nuclear war will begin on any given day.
The 70-year numbers use the fact that if there is a constant probability $p$ of a nuclear war in a given year, the probability of at least one nuclear war in 70 years is $1 - (1 - p)^{70}$. The upper limit of 7% for the probability of a nuclear war in the next year (Barrett et al., 2013) is clearly not plausible as a constant probability of a nuclear war each year during that period: otherwise the probability that we would already have had one is 99%.
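The conversions behind the table are simple enough to check directly. A minimal Python sketch of the arithmetic, using the Good Judgment Project figure (the daily ppm column assumes the same constant hazard):

```python
def seventy_year(p_annual):
    """Probability of at least one war in 70 years, given a constant annual probability."""
    return 1 - (1 - p_annual) ** 70

def daily_ppm(p_annual):
    """Chance on any given day, in parts per million, under the same constant hazard."""
    return (1 - (1 - p_annual) ** (1 / 365)) * 1e6

# Good Judgment Project: 0.7% per year
print(round(seventy_year(0.007), 2))  # -> 0.39, i.e., roughly a 40% chance in 70 years
print(round(daily_ppm(0.007)))        # -> 19, i.e., roughly 20 chances in a million per day
```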
| Source | Probability of a nuclear war, annualized | In 70 years | Daily (ppm) |
|---|---|---|---|
| Hellman (2008) | 0.02%–0.5% | 1%–30% | 0.5–14 |
| Barrett et al. (2013) | 0.0001%–7% | 0.007%–99% | 0.003–200 |
| Lundgren (2013) | 1.4% | 60% | 40 |
| Project for the Study of the 21st Century (2015) | 0.3% | 20% | 8 |
| Good Judgment Project (2018) | 0.7% | 40% | 20 |
| Turchin (2010) | 0.5% | 30% | 14 |
| Pamlin and Armstrong (2015)^{[7]} | 0.1% | 7% | 3 |
| Sandberg and Bostrom (2008)^{[8]} | 0.4% | 24% | 11 |
Similarly, the ppm numbers can be interpreted as suggesting that each new day the leaders of the nuclear-weapon states roll the cylinder and pull the trigger in a game of Russian roulette, with the indicated chance of the result being a nuclear war.
It seems useful to highlight the Good Judgment Project (2018), because it uses a methodology developed by a 20-year project funded in part by the Intelligence Advanced Research Projects Agency and documented in Tetlock and Gardner (2015). That methodology produced forecasts 30 percent better than those of intelligence analysts with access to classified information. It is as follows:
- Recruit volunteers and ask them a series of forecasting questions, like estimating the probability of a certain event in a specific time period (typically 1, 2 or 3 years).
- Identify the volunteers with the best forecasts.
- Organize them into teams.
- Study what the best teams did.
This methodology might potentially be crowdsourced on platforms like Wikidata, Wikiversity, and Wikipedia.
The 0.7 percent chance of a nuclear war starting in the coming year estimated by the Good Judgment Project is equivalent to a 40 percent chance in 70 years and 20 chances in a million that it will start in the next 24 hours.
Other leading figures supporting Ellsberg's claims[edit]
Ellsberg is not alone in his concern about this. Robert McNamara also said that as long as the world has large nuclear arsenals, it's only a question of time before there is a nuclear war.^{[9]} Similar concerns led former US Senator Sam Nunn and media executive Ted Turner to found the Nuclear Threat Initiative, also supported by former US Secretary of Defense William J. Perry, and former US Secretaries of State Henry Kissinger and George Shultz.
Atmospheric scientists Owen Toon, Alan Robock et al. (2017) have estimated that a relatively minor nuclear war between India and Pakistan could involve at least 100 nuclear weapons, leading to a nuclear autumn during which two billion people (just over a quarter of humanity) not involved in the nuclear exchange would starve to death.
A hundred nuclear weapons is only about 2 percent of the US nuclear arsenal. A nuclear war involving the US would likely be closer to Ellsberg's doomsday scenario than the two billion dead mentioned by Toon, Robock et al. (2017).
Nuclear proliferation[edit]
The fact that nuclear proliferation is continuing suggests that any model that assumes the risk of a nuclear war is constant or declining is probably wrong. When the Nonproliferation Treaty took effect in 1970, there were 5 nuclear-weapon states. When US President George W. Bush announced an “axis of evil” consisting of North Korea, Iran and Iraq on 2002-01-28, there were 8. As this is being written in 2019, there are 9. As long as nuclear-weapon states continue to threaten countries without them, the pressure for nuclear proliferation will continue, and the risk of a nuclear war will likely grow.
| Year | Number of nuclear-weapon states | Event |
|---|---|---|
| 1970 | 5 | Nonproliferation Treaty takes effect |
| 2002 | 8 | “Axis of evil” speech by US President Bush condemning North Korea, Iran and Iraq |
| 2006 | 9 | First test of a nuclear weapon by North Korea |
Future work[edit]
It is relatively easy to use the glm function in the R (programming language) to model a trend in the log(Poisson mean) of the number of first tests by new nuclear-weapon states each year; the bssm package for R can similarly model it as a random walk.
Beyond this, it could be useful to expand the present study to consider larger lists of incidents threatening nuclear war. For this purpose, it might be useful to recruit volunteers on Wikimedia Foundation projects, especially Wikipedia, Wikiversity, and Wikidata, to produce estimates like these using the methodology of the Good Judgment Project (2018) described in Tetlock and Gardner (2015). Wikipedia already does something like this: Peter Binkley, in an invited 2006 article for a Canadian Library Association journal, said that on controversial topics "the two sides actually engaged each other and negotiated a version of the [Wikipedia] article that both can more or less live with. This is a rare sight indeed in today’s polarized political atmosphere, where most online forums are echo chambers for one side or the other”.^{[10]}
Another potentially useful project could be to write an R function to convert probability distributions generated by models like those discussed here into estimates of the probability that a person of any age, especially a child born today, would die prematurely from a nuclear war. Stanford Engineering Professor Emeritus Martin Hellman has estimated that the probability is at least 10 percent that a child born today would die prematurely from a nuclear war.^{[11]} These kinds of analyses might help a broader audience understand the seriousness of this issue.
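The kind of conversion described here can be sketched as follows. This is purely illustrative and is not Hellman's calculation: the 0.7 percent annual war probability is the Good Judgment figure, while the 80-year remaining lifespan and the 50 percent chance of dying given that a war occurs are made-up inputs for demonstration only.

```python
def p_premature_death(p_war_annual, years_remaining, p_die_given_war):
    """Lifetime probability of dying in a nuclear war, assuming a constant annual hazard."""
    p_any_war = 1 - (1 - p_war_annual) ** years_remaining
    return p_any_war * p_die_given_war

# Illustrative inputs only: 0.7%/year, 80-year lifespan, 50% fatality given a war
print(round(p_premature_death(0.007, 80, 0.5), 2))  # roughly a 1-in-5 chance under these assumptions
```

A fuller version would replace the single fatality number with a distribution over war severities, as item 3(b) of the methodology suggests.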
See also[edit]
- Time to extinction of civilization, which provides more detail behind part of this present discussion.
- Forecasting nuclear proliferation, which predicts continuing increases in the number of nuclear-weapon states, thereby seemingly increasing the threat of nuclear war and Armageddon.
- Confirmation bias and conflict, which explains how the mainstream media everywhere exploit basic features of human psychology to enhance the social status of those who control media funding and governance in ways that threaten the extinction of civilization.
References[edit]
- Anthony M. Barrett; Seth Baum; Kelly Hostetler (May 2013), "Analyzing and Reducing the Risks of Inadvertent Nuclear War Between the United States and Russia", Science and Global Security, 21 (2), Wikidata Q66151664^{[12]}
- Good Judgment Project (2018), nonpublic data, cited in comment 2018-08-06 by Carl Shulman to Rhys Lindmark (6 August 2018), "Current Estimates for Likelihood of X-Risk?", Effective Altruism Forum, Wikidata Q66730943.
- Martin Edward Hellman (May 2008), "Risk Analysis of Nuclear Deterrence", The Bent of Tau Beta Pi, Wikidata Q66156663
- Carl Lundgren (27 June 2013), "What are the odds? Assessing the probability of a nuclear war", Nonproliferation Review, 20 (2): 361–374, Wikidata Q66149076^{[12]}
- Experts see rising risk of nuclear war: survey, Project for the Study of the 21st Century, 12 November 2015, Wikidata Q66179978
- Dennis Pamlin; Stuart Armstrong (2015), Global Challenges: 12 Risks that Threaten Human Civilisation, Global Challenges Foundation, Wikidata Q66202646
- Anders Sandberg; Nick Bostrom (2008), Global Catastrophic Risks Survey, Wikidata Q66210566
- Philip E. Tetlock; Dan Gardner (2015), Superforecasting: The art and science of prediction, Crown Publishing Group, Wikidata Q21203378
- Owen B. Toon; Alan Robock; Michael Mills; Lili Xia (2017), "Asia Treads the Nuclear Path, Unaware That Self-Assured Destruction Would Result from Nuclear War", The Journal of Asian Studies, 76 (2): 437–456, doi:10.1017/S0021911817000080, Wikidata Q58262021
- Alexei V. Turchin (8 August 2010), Structure of the global catastrophe. Risks of human extinction in the XXI century, Lulu.com, Wikidata Q66181092
Notes[edit]
- ↑ Daniel Ellsberg; Amy Goodman; Juan González (6 December 2017), Daniel Ellsberg Reveals He was a Nuclear War Planner, Warns of Nuclear Winter & Global Starvation, Democracy Now!, Wikidata Q64226035
- ↑ A version of this article is scheduled to appear in Spencer Graves (2019), "Time to nuclear Armageddon", JSM proceedings, Wikidata Q66918248. More on this appears in Time to extinction of civilization.
- ↑ Daniel Ellsberg (2017), The Doomsday Machine: Confessions of a Nuclear War Planner, Bloomsbury Publishing, Wikidata Q63862699, p. 206.
- ↑ Robert McNamara; James G. Blight (2003), Wilson's ghost: reducing the risk of conflict, killing, and catastrophe in the 21st century, PublicAffairs, Wikidata Q64736611, pp. 189-190.
- ↑ Simon Beard, Wikidata Q64708568
- ↑ cited from private communication from Simon Beard. The numbers here correct minor errors in the corresponding slide in the accompanying video, commons:File:Graves-JSM2019-08-01.webm
- ↑ p. 16/212: “The likelihood of a full-scale nuclear war between the USA and Russia has probably decreased. Still, the potential for deliberate or accidental nuclear conflict has not been removed, with some estimates putting the risk in the next century or so at around 10%”. This makes the risk in 1 year $1 - 0.9^{1/100} = 0.00105$, and the risk in 70 years $1 - 0.9^{70/100} = 0.071$, ignoring their comment that “The likelihood ... has probably decreased” and ignoring the chances of a nuclear war involving other nuclear-weapon dyads. Later, they write, “Based on available assessments the best current estimate for nuclear war within the next 100 years is 5% for infinite threshold [and] 0.005% for infinite impact” (p. 148).
- ↑ p. 1 (p. 2 of 6 in the pdf): a 30% chance by 2100 of “at least 1 million dead” as the “total killed in all nuclear wars”, estimated from 2008.
- ↑ Robert McNamara; James G. Blight (2003), Wilson's ghost: reducing the risk of conflict, killing, and catastrophe in the 21st century, PublicAffairs, Wikidata Q64736611
- ↑ Peter Binkley (2006), "Wikipedia Grows Up", Feliciter (2): 59–61, Wikidata Q66411582
- ↑ Christine Blackman (17 July 2009), "Chance of nuclear war is greater than you think: Stanford engineer makes risk analysis", Stanford News, Wikidata Q66424609
- ↑ ^{12.0} ^{12.1} cited from Simon Beard (15 September 2017), "Less Hollywood, More Car Crash", Centre for Research in the Arts, Social Sciences and Humanities, University of Cambridge, Wikidata Q66147141