Time to extinction of civilization
|This is a research project at Wikiversity.|
- This essay is on Wikiversity to encourage wide discussion of the issues it raises, moderated by the Wikimedia rules that invite contributors to “be bold but not reckless,” contributing revisions written from a neutral point of view, citing credible sources, and raising other questions and concerns on the associated “Discuss” page.
|Khrushchev, Soviet head of state, 1953-1964||Kennedy, US President, 1961-63|
|Castro, Cuban head of state, 1959-2008||McNamara, US Secretary of Defense, 1961-68|
Robert McNamara and Daniel Ellsberg both stated that as long as major powers maintain large nuclear arsenals, it is only a matter of time before a nuclear incident generates runaway demands for vengeance, which will likely end only after nearly all the nuclear stockpiles are exhausted. McNamara was Secretary of Defense during the 1962 Cuban Missile Crisis, and Ellsberg was an expert in nuclear war strategy advising McNamara at that time. Both have researched and commented on this since then.
This article presents a methodology for estimating the probability distribution of the time to extinction of civilization.
- In brief, this analysis estimates a probability of roughly 11 percent that sometime in the next 40 years, within the expected lifetimes of roughly half of the people alive today, a nuclear war will occur that will lead to the deaths of 98 percent of humanity and destroy civilization in the process -- a nuclear Armageddon. Similarly, the probability of such an event in the next 60 years (before the end of the twenty-first century as this is being written) is roughly 19 percent.
- This risk will be essentially eliminated if the world's large nuclear arsenals are destroyed first.
These estimates are obtained by straightforward application of standard methods for censored data analysis of the time between events, with one observed time of 21 years between the 1962 Cuban Missile Crisis and the 1983 Soviet nuclear false alarm incident, and with the time to the next such event being longer than (censored at) the 36 years between 1983 and 2019 as this is being written, and assuming a probability averaging 0.45 that a comparable incident in the future would produce a nuclear Armageddon.
However, the public might not know about the 1983 Soviet nuclear false alarm incident if the Soviet Union had not collapsed: That information could still be classified “Top Secret” in the Kremlin. There may well have been similar incidents in the US that are still classified. If so, the estimates discussed here could be quite conservative.
The uncertainty in these estimates is substantial. However, even allowing for this uncertainty, the analysis suggests that
- the greatest threat to the national security of each of the major nuclear powers and to the future of humanity more generally is its own nuclear arsenal,
as implied by the publications of McNamara, Ellsberg, and others.
The entire methodology is described herein. Readers are invited to conduct any modification they feel appropriate and report the results with this article or on the companion “Discuss” page, preferably citing sources for improvements on the analysis described here.
- 1 Background
- 2 Normal or system accidents
- 3 Time between nuclear crises
- 4 Probability that a nuclear crisis would generate a major nuclear war
- 5 Probability that a major nuclear war might lead to the extinction of civilization
- 6 Fermi's paradox
- 7 Other existential risks
- 8 How could the situation be this bad with the current status of press freedom in the world?
- 9 What can concerned individuals do about this?
- 10 Appendix. Relevant statistical theory and computations
- 11 See also
- 12 References
- 13 Notes
Background

This article discusses the threat posed by large nuclear arsenals. In this effort, we describe a methodology for estimating the probability distribution of the time to extinction of civilization.
Ellsberg (2017) said he was a nuclear war planner from 1958 to 1971 for US presidents Eisenhower, Kennedy, Johnson, and Nixon. In 1961 early in the administration of John Kennedy, the new president (at Ellsberg's suggestion) asked the military how many people in Russia and China would die in a nuclear war. Answer: 275 million in the initial exchange, with another 50 million dying of radiation poisoning over the next six months. Kennedy then asked (again at Ellsberg's suggestion) about Eastern Europe, then Western Europe, then the US. Answer: In a nuclear war, a third of humanity would die, most in the initial exchange and the rest from radiation poisoning over the next six months.
That does not consider nuclear winter, which was not seriously discussed until the 1980s. Ellsberg believes that a nuclear winter would almost certainly follow a major nuclear war, and that 98 percent of the survivors of the initial exchange would die of starvation as the earth experiences several years without summers, caused by soot from firestorms lofted into the stratosphere, where it would stay for years, possibly decades, because few rain clouds form there. This soot could block 70 percent of the sunlight, killing nearly all vegetation. Most animals, including humans, would also die.
Normal or system accidents
In The Limits of Safety, Scott Sagan documented numerous incidents in the post-World War II period that have been described as posing a high risk of nuclear war by accident or miscalculation. Sagan claimed that these incidents reflect the virtual impossibility of operating any sufficiently complex system at ultra-high levels of reliability, a phenomenon known as “normal accidents” or “system accidents”.
The basic problem was illustrated by a college classmate, who came to school with one hand all bandaged up: he said he had defeated the safety interlock on his Cuisinart.
This silly story seems similar to the miscalculations that led the Soviet Union to put nuclear weapons on Cuba and led the Kennedy administration to depth charge a Soviet submarine and prepare to invade Cuba on the mistaken belief that they would not be facing nuclear retaliation in doing so, as discussed in the next section. It also seems similar to what might have happened if the 1983 Soviet nuclear false alarm had been reported as a real attack to the Kremlin, also discussed in the next section.
Time between nuclear crises
For the purpose of the present analysis, we assume that the world has seen two nuclear crises that almost resulted in a nuclear war like the one just outlined: the Cuban Missile Crisis of 1962 (October 16-28) and the September 26, 1983, Soviet nuclear false alarm incident.
- We also assume that the distribution of the time between such incidents is constant.
Some other list of nuclear incidents could be used. For example, as of 2018-12-23, the Wikipedia article on “List of nuclear close calls” described 11 incidents, being the Cuban Missile Crisis, the 1983 Soviet nuclear false alarm incident, and nine others. Some of these other incidents might have escalated to a nuclear Armageddon.
For simplicity using only these two best known incidents, we will assume that the probability that each incident would escalate to Armageddon is fixed. With a larger number of incidents, we might make that probability a random variable. That would complicate the analysis. That option may be pursued in some future extension to this article.
We also assume that all such major incidents are known, which seems foolish: The 1983 Soviet nuclear false alarm incident only became public knowledge several years after the collapse of the Soviet Union, and similar such incidents could have occurred in the US military and the events could still be classified “Top Secret”. A future update to this article may also try to model the time from each such crisis to publication of information about it.
Standard censored-data analysis of one observed time of 21 years and another time longer than (censored at) 36 years produces an estimated mean time between such crises of 57 years, as outlined in Appendix 1 below.
Probability that a nuclear crisis would generate a major nuclear war
We want to build this into an estimate of the probability distribution of the time to extinction of human civilization. To do this, we combine the (random) time to the next nuclear crisis with a random variable indicating whether that next nuclear crisis would escalate into the extinction of civilization.
Probability that another event like the Cuban Missile Crisis might lead to Armageddon
To estimate this, it's useful to consider McNamara's report of his conversation with Fidel Castro in 1992 during a conference on the thirtieth anniversary of the Cuban Missile Crisis. In October 1962 there were 162 Soviet nuclear warheads in Cuba, which the US government did not believe were there. At that 1992 conference, Castro told McNamara, “we started from the assumption that if there was an invasion of Cuba, nuclear war would erupt. We were certain of that. ... We would be forced to pay the price, that we would disappear.”
This comment is indirectly supported by Ellsberg's discussion of how the Cuban Missile Crisis ended: He said that a Soviet commander on Cuba shot down a US U-2 spy plane killing the pilot, and Castro's anti-aircraft batteries were shooting at other US surveillance aircraft flying over the island. Both these actions directly violated explicit orders from Moscow. Robert Kennedy, the President's brother and Attorney General, told the Soviet ambassador, Dobrynin, "You have drawn first blood ... . [T]he president had decided against advice ... not to respond militarily to that attack, but he [Dobrynin] should know that if another plane was shot at, ... we would take out all the SAMs and antiaircraft ... . And that would almost surely be followed by an invasion.” When Soviet premier Khrushchev heard this, he knew that he had lost control of events in and around Cuba. He immediately accepted the best terms he had received from Washington.
Another event during the Cuban Missile Crisis also almost triggered a nuclear exchange that likely would have killed 98 percent of humanity: On 27 October 1962 a Soviet patrol submarine was being depth charged by the US navy on the surface, unaware that the Soviet sub carried nuclear-tipped torpedos. That submarine could not communicate with Moscow at that time. The commander of that sub and his second-in-command both believed that (a) a nuclear war had probably already begun and (b) their standing orders required them to use their nuclear-tipped torpedoes before surrendering. Fortunately, the commander of the flotilla of subs to which that vessel belonged was on that particular sub. That higher ranking officer disagreed and persuaded the sub commander to surface without using their nuclear-tipped torpedoes.
The point of this discussion is to suggest that the world was incredibly lucky that the Cuban Missile Crisis did not end with at least one nuclear weapon being used.
As to whether the use of one nuclear weapon would trigger a nuclear response, it's useful to recall the comments of McNamara on this: “[I]t is highly unlikely that a nuclear attack on New York City or another American city would end the horror. The urge for revenge in kind would be powerful ... [A] December 2002 Washington Post / ABC News poll revealed that 60 percent of Americans favor nuclear retaliation against Iraq if it were to use either chemical or biological weapons against attacking U.S. forces.”
For present purposes, we'll use 50 percent as a plausible estimate of the probability that a crisis analogous to the Cuban Missile Crisis would have ended in a major nuclear war. There is, of course, substantial uncertainty in that number.
Probability that another event like the 1983 Soviet nuclear false alarm incident might lead to Armageddon
A similar analysis of the 1983 Soviet nuclear false alarm incident must consider the reports that Ronald Reagan began his term as US President with substantial saber rattling. This included extensive testing of Soviet defenses by sending military aircraft flying directly toward the USSR, then turning around just before entering Soviet air space. Bruce Blair, an expert on Cold War nuclear strategies and former president of the World Security Institute in Washington, D.C., said, "The Russians (Soviets) saw a US government preparing for a first strike, headed by a President Ronald Reagan capable of ordering a first strike."
In that climate, on September 1, 1983, the Soviets shot down a civilian airliner that had strayed into Soviet airspace, killing all 269 passengers and crew on board. Just over 3 weeks later, on September 26, the equipment in the command center of the Soviet early warning satellites reported the launch of one intercontinental ballistic missile (ICBM) from the US towards the Soviet Union. The system later reported four more ICBMs headed toward the USSR. Lt. Col. Stanislav Petrov, the officer on duty then at this command center, reasoned that a first strike attempt from the US would likely involve hundreds of missiles, not just one or five.
Petrov further noted that he served in that “duty officer” role with others, all of whom were professional soldiers with purely military training. He was the only one with civilian training in addition to his military training. He said if any of the others had been on duty at that time rather than him, they likely would have reported the incident as the launches of actual missiles, with a substantially elevated risk of that incident initiating a major nuclear war.
Oleg Kalugin, a former KGB chief of foreign counter-intelligence who knew Andropov, the Soviet leader, well, says that Andropov's distrust of American leaders was profound. If Petrov had declared the satellite warnings valid, Andropov might have immediately ordered a counterattack.
For the moment, we'll use 40 percent as the probability that a crisis with essentially equivalent characteristics would produce a nuclear war.
Later, we may ask experts about these and perhaps other nuclear close calls to obtain better estimates of the probabilities that essentially equivalent crises in the future would lead to a nuclear war.
Probability that a major nuclear war might lead to the extinction of civilization
Ellsberg insists that a major nuclear war would almost certainly lead to nuclear winter. In this simple database of only two crises that threatened nuclear war between superpowers with large nuclear arsenals, we will assume for the present that if a nuclear war starts, it will lead to premature deaths of 98 percent of humanity, mostly from the resulting nuclear winter.
It would be useful to include a more substantive review of the literature on nuclear winter, especially if the set of nuclear near misses were expanded to include cases of potential nuclear war between secondary powers that may end without involving countries with larger nuclear arsenals.
The discussion in the previous section suggested 50 and 40 percent as the probabilities that events substantially equivalent to the 1962 Cuban Missile Crisis and the 1983 Soviet false alarm incident would end in a nuclear war and winter. To quantify the sensitivity of this analysis, we will use 45 percent as a typical number, with 30 and 60 percent as the ends of an approximate 80 percent confidence interval. Random probabilities like this are typically modeled with a beta distribution, and one with parameters 7.94 and 9.76 has 10 percent of its distribution below 0.3, 10 percent above 0.6, and a mean of 0.45; we used that in simulating the probability that each major nuclear incident would lead to a nuclear Armageddon. If you don't like these assumptions, dear reader, please substitute your own.
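As a quick sanity check on these beta parameters, the mean and tail probabilities can be approximated by simulation using only the Python standard library. This is a hypothetical sketch, not part of the Ecfun vignette; the parameters 7.94 and 9.76 come from the text, while the seed and sample size are arbitrary.

```python
import random

random.seed(0)

A, B = 7.94, 9.76  # beta parameters from the text
N = 200_000

# Draw from Beta(A, B) and estimate the mean and the two tail probabilities.
draws = [random.betavariate(A, B) for _ in range(N)]

mean = sum(draws) / N
below_30 = sum(d < 0.30 for d in draws) / N
above_60 = sum(d > 0.60 for d in draws) / N

print(f"mean     ~ {mean:.3f}")      # should be near 0.45
print(f"P(<0.30) ~ {below_30:.3f}")  # should be near 0.10
print(f"P(>0.60) ~ {above_60:.3f}")  # should be near 0.10
```

The simulated mean should be close to 7.94 / (7.94 + 9.76) ≈ 0.449, with roughly 10 percent of the draws in each tail, matching the claims above.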
We further assume that the time between major nuclear crises follows an exponential distribution. Standard censored-data analysis of one observed and one censored time, as discussed here, leads to a maximum likelihood estimate of 57 years for the mean time between nuclear crises, as outlined in Appendix 1 below. Uncertainty in this estimate was modeled using the Bayesian posterior assuming an improper prior locally uniform in the logarithm of the mean time to the next major nuclear crisis. The accompanying normal probability plot shows the results of 10 million simulated times to nuclear Armageddon based on these assumptions. The estimates of 11 and 19 percent for the probability of a nuclear Armageddon in the next 40 or 60 years, respectively, are marked on this plot. This justifies the numbers given in the introduction above.
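The structure of such a simulation can be sketched with standard-library Python. This is a hypothetical outline, not the Ecfun vignette's code: the posterior draw for the mean time between crises uses μ = S/E with S = 21 + 36 = 57 years and E an exponential(1) draw, which is one way to sample an inverse-gamma posterior with one observed event, and the resulting probabilities need not reproduce the 11 and 19 percent figures exactly, since those depend on details of the vignette's implementation.

```python
import random

random.seed(1)

S = 21 + 36      # total exposure time in years: one observed event, one censored
N_SIM = 100_000

def sim_time_to_armageddon():
    # Draw the mean time between crises from the posterior:
    # with one observed event and a prior uniform in log(mu),
    # mu = S / E where E ~ Exponential(1).
    mu = S / random.expovariate(1.0)
    # Draw the per-crisis escalation probability from Beta(7.94, 9.76).
    p = random.betavariate(7.94, 9.76)
    # A geometric number of exponential(mu) waits until the first escalating
    # crisis is, conditional on (mu, p), exponential with mean mu / p.
    return random.expovariate(p / mu)

times = [sim_time_to_armageddon() for _ in range(N_SIM)]
p40 = sum(t < 40 for t in times) / N_SIM
p60 = sum(t < 60 for t in times) / N_SIM
print(f"P(Armageddon within 40 years) ~ {p40:.2f}")
print(f"P(Armageddon within 60 years) ~ {p60:.2f}")
```

Readers who want the actual analysis behind the figures quoted in this article should consult the Ecfun vignette described in the Appendix.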
This analysis could be improved by considering a larger list of potential nuclear crises that might have led to a nuclear Armageddon.
Fermi's paradox

These numbers are sufficient to explain Fermi's paradox: The famous physicist Enrico Fermi was convinced that the available evidence suggested that most stars would have a planet that could support life roughly like our earth. Such planets, Fermi thought, would most likely produce life like that on earth within a few million years, including life with sufficient intelligence and physical dexterity to generate radio communications that could be detected on Earth. Yet so far, no such extraterrestrial intelligence has been detected by the Search for Extraterrestrial Intelligence (SETI). This contradiction led Fermi to ask, “Where is everybody?” The question has become known as Fermi's paradox.
The present discussion offers one possible answer to Fermi's question: If any extraterrestrial intelligent beings developed nuclear weapons within a few decades of developing radio, as humans have, then it would likely be at most a few thousand years before such a civilization becomes extinct and stops transmitting coherent signals into outer space -- unless, of course, such beings destroyed all substantive nuclear stockpiles before such stockpiles were actually used -- and managed to prevent their subsequent recreation indefinitely.
Other existential risks
This analysis is not intended to imply that a nuclear Armageddon is the only potential existential risk that could destroy civilization. The Future of Life Institute attempts to focus on the most likely and dangerous of these. Their program covers four key areas:
- AI: Artificial intelligence is progressing so rapidly that if humans give robots the ability to design and build even more advanced robots, those machines could decide that humans threaten their future and get rid of us.
- Biotech: Advances in biotechnology could lead to the design and production of superbugs so virulent that they would kill too many people for civilization to be sustained, as suggested here.
- Nuclear, as discussed here.
- Climate chaos from global warming.
Of course, these could interact. Climate chaos could lead to a nuclear war and winter. Or advanced robots could use biotech as a means of ridding themselves of the threats posed by unfriendly humans.
This article focuses on the nuclear threat for two reasons:
- Daniel Ellsberg has made a very strong case that it's only a question of time before something generates a crisis that ends in a nuclear Armageddon as discussed herein, unless we eliminate the world's large nuclear arsenals first.
- This author knows of no comparable documentation suggesting that any other existential threat is as clear and compelling.
How could the situation be this bad with the current status of press freedom in the world?
How could it have been possible to build a doomsday machine like that claimed by Ellsberg (2017) without a more serious public discussion having occurred years ago, at least in the United States? These claims seem to contradict the much vaunted commitment to freedom of the press enshrined in the First Amendment to the United States Constitution. One might naively think that a threat of this magnitude would have long ago generated a substantive public discussion. This is particularly true with the protections to press freedom enforced by the 1964 US Supreme Court decision in New York Times Co. v. Sullivan.
This seeming paradox can be explained in terms of (a) how humans make decisions and (b) how the mainstream media are funded and governed.
How humans make decisions
Daniel Kahneman won the 2002 Nobel Memorial Prize in Economic Sciences for seminal contributions to understanding how humans think and make decisions. Kahneman is a research psychologist, not an economist. He won the economics prize for establishing that the standard economics models of the “rational person” are not how people actually think.
In brief, Kahneman and others have established that people make most decisions intuitively based on what comes most readily to mind. We all have to do this just to get through the day. Kahneman noted that we are capable of more careful thought and deliberation, but we rarely do this, even when we should.
This often leads us to be overconfident. When overconfidence leads us to believe we are better than a certain adversary, this is called comparative optimism. This plausibly afflicted both sides in the Cuban Missile Crisis: Soviet leaders probably underestimated the likely response of the US, and the US underestimated the military strength of the Soviet forces already in Cuba and of the Soviet submarine depth charged by the US Navy. As noted above, the US did not believe the Soviets already had nuclear weapons on Cuba and did not believe the Soviets had nuclear-tipped torpedoes in the submarine they depth charged, and was wrong on both counts. If there had been one more false move by either side, I likely would not be here to write this article today, and you likely would not be here to read it.
Overconfidence is also present in how easily most people accept the mainstream media's designation of enemies of their country without making an effort to try to understand (a) why anyone would oppose them or (b) the magnitude of the threat, and (c) the most effective response.
Business model of media organizations
Every media organization in the world sells changes in audience behaviors to the people who control media funding and governance:
- A media organization without an audience will not have funding for long.
- A media organization that displeases its funders, the people who give it money, will not likely survive for long.
In the United States, the mainstream commercial broadcasters get 100 percent of their funding from advertising. The Public Broadcasting Service and most print media get most of their funding from advertising.
Advertisers obviously want to increase sales of their goods or services. They also do not want the public aware of any of their activities that might be questionable.
The US has intervened many times in other countries since President Washington sent US tax money to plantation owners in Haiti to help them suppress a slave rebellion during the French Revolution. In many, and perhaps all, of these interventions, the US government acted to protect US international business interests. Since World War II, many of these actions have been undertaken in secret. Even when there was public debate in the US media, rarely if ever was enough information provided to allow typical US citizens to understand why any rational person might support the other side. Media coverage of those perspectives might have offended US international business executives, who controlled advertising budgets.
Meanwhile, in some of these interventions, some journalists and media executives were ignorant of US involvement. In other cases, the media executives and journalists apparently chose to either (a) suppress reports when they could do so with minimal risk of losing audience or (b) spin the stories to support their funders to the extent they could do so while still retaining most of their audience.
What can concerned individuals do about this?
- Organize or join a study group to review Ellsberg's book and the analysis in this article, exploring the implications and relative plausibility of alternative assumptions and analyses.
- Support the International Campaign to Abolish Nuclear Weapons (ICAN). This might include lobbying your public officials to sign and ratify the Treaty on the Prohibition of Nuclear Weapons. In addition to lobbying national figures, ICAN encourages its supporters to lobby lower level public officials as part of their ICAN Cities Appeal.
- If you live in a country with nuclear weapons, especially a large nuclear arsenal, lobby your government to unilaterally destroy its nuclear arsenal. If you live in a country with foreign nuclear weapons on your territory, like Germany to name only one, lobby your government to demand the removal of those nukes and forbid your military from engaging in any activities involving such weapons. The analysis in this article suggests that you are safer without such weapons than with them.
- Seek alternative sources for news. You might find perspectives that may give you a deeper understanding of issues and events than what might be available from mainstream sources. Try to remember your sources for different pieces of information, because that can make it easier for you to learn from comparing superficially conflicting sources.
- Support citizen-directed subsidies for journalism as provided by the US Postal Service Act of 1792, which arguably made a major contribution to making the US what it is today. I don't want either government bureaucrats or corporate bureaucrats censoring the media I consume.
- Support gradually escalating “national security taxes” on all trade with nations possessing nuclear weapons in proportion to the threat they pose to the future of humanity.
Appendix. Relevant statistical theory and computations
These appendices outline some of the math used in these computations. Part of this is implemented in a Google sheet entitled “Time to extinction of civilization”, which interested readers can copy and modify as they wish to test alternative assumptions.
A complete description of the math and the simulation is provided in the companion vignette in the Ecfun package for R. You can get this by first installing the free, open-source R (programming language) software. Then in R, use 'install.packages("Ecfun", repos="http://R-Forge.R-project.org")' to get the package, and 'help(pac="Ecfun")' to see the package contents. From there, click “User guides, package vignettes and other documentation” to see the list of vignettes, then click “R code” and “HTML” next to “Ecfun::nuclearArmageddon, Time to nuclear Armageddon” to get the code and an HTML version of the vignette. You can copy the “R code” into a new R file and execute it line by line as you read the HTML version, replicating what you read and testing alternatives to evaluate the impact of different approaches to this simulation. This vignette was added in version 0.1-9 of Ecfun, available from R-Forge. As of this writing, the vignette is available only in that development version: the current released version, installable via 'install.packages("Ecfun")', is 0.1-7, which does not include it.
Appendix 1 in the present document describes the standard method for estimating mean time between events like this. Appendix 2, below, outlines some of the math we used to describe the probability distribution of the time to extinction of civilization that results from the assumptions described in this document. More detail including confidence intervals and simulated times are described in the vignette with the Ecfun package for R, just mentioned.
Appendix 1. Estimating the mean time between nuclear crises
This appendix documents the estimation of the mean time between nuclear crises. For this we assume an exponential distribution, the simplest lifetime distribution. It basically assumes that the time to the next crisis is independent of the time since the most recent previous one, which seems reasonable for this application.
For the exponential distribution with mean \(\mu\), the probability that the random time \(T\) to the next event exceeds \(t\) is \(S(t) = P(T > t) = e^{-t/\mu}\), and the density is \(f(t) = (1/\mu)\,e^{-t/\mu}\). Then the likelihood for observing one lifetime of 21 years and another censored at 36 years is the product of \(f(21)\) and \(S(36)\):

\[ L(\mu) = f(21)\,S(36) = \frac{1}{\mu}\,e^{-21/\mu}\,e^{-36/\mu} = \frac{1}{\mu}\,e^{-57/\mu} \]
Here, “life” obviously refers to the time to the next incident that poses a major threat of a nuclear war leading to the extinction of civilization. This likelihood is maximized at \(\hat\mu = 57\) years. The standard deviation of the exponential distribution is equal to its mean. The fact that we also have a censored observation reduces the uncertainty in this estimate, but we also don't know that the distribution is exponential. Standard statistical theory can be used to estimate confidence intervals for \(\mu\). A future version of this article will include that.
A spreadsheet including these computations is available as a Google sheet entitled “Time to extinction of civilization”.
More generally, for future reference, suppose we have \(n\) times \(t_1, \ldots, t_n\), with the first \(k\) being observed event times and the remaining \(n - k\) being censoring times; then the likelihood is as follows:

\[ L(\mu) = \mu^{-k} \exp\!\left(-\frac{1}{\mu}\sum_{i=1}^{n} t_i\right) \]

Then the log(likelihood) is as follows:

\[ \ell(\mu) = -k \log \mu - \frac{1}{\mu}\sum_{i=1}^{n} t_i \]

Or in terms of the rate \(\lambda = 1/\mu\), this is as follows:

\[ \ell(\lambda) = k \log \lambda - \lambda \sum_{i=1}^{n} t_i \]

Then the score function, being the first derivative of the log(likelihood), is as follows:

\[ \frac{d\ell}{d\lambda} = \frac{k}{\lambda} - \sum_{i=1}^{n} t_i \]

This is 0 where:

\[ \hat\lambda = \frac{k}{\sum_{i=1}^{n} t_i}, \qquad \hat\mu = \frac{1}{\hat\lambda} = \frac{\sum_{i=1}^{n} t_i}{k} \]
This gives us the Maximum Likelihood Estimate (MLE) for both \(\lambda\) and \(\mu\). The MLE is typically written with a circumflex (^), so for the current data set of one observed and one censored time, we write \(\hat\mu = (21 + 36)/1 = 57\). Confidence intervals for \(\mu\) are obtained in the companion vignette in the Ecfun package for R, mentioned above. That vignette also derives an inverse-gamma distribution as a Bayesian posterior distribution for \(\mu\), assuming an improper prior locally uniform in \(\log \mu\). This posterior is one of the key components of the Monte Carlo simulation of 10 million simulated times to extinction summarized in the figure above.
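As a numerical check on this algebra, the log-likelihood for one observed time of 21 years and one censoring time of 36 years can be scanned over a grid of candidate means; a hypothetical sketch in Python (the grid range and step are arbitrary):

```python
import math

def loglik(mu, observed=(21,), censored=(36,)):
    # Exponential model: -k*log(mu) - (sum of all times)/mu,
    # where k is the number of observed (uncensored) events.
    k = len(observed)
    total = sum(observed) + sum(censored)
    return -k * math.log(mu) - total / mu

# Scan candidate means from 1.0 to 200.0 years in steps of 0.1
# and find the maximizer of the log-likelihood.
grid = [m / 10 for m in range(10, 2001)]
mle = max(grid, key=loglik)
print(mle)  # the analytic MLE is (21 + 36) / 1 = 57 years
```

The grid maximizer agrees with the closed-form answer \(\hat\mu = \sum t_i / k = 57\).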
Appendix 2. Probability distribution of the time to extinction of civilization
For the purposes of this analysis, we assume that humans are not able to destroy the world's large nuclear arsenals before they destroy us, as suggested by our explanation of Fermi's Paradox, above. We further assume that the other aspects of this analysis are adequately modeled by the data we consider and our interpretation thereof.
The exponential distribution is a special case of the gamma distribution with shape parameter 1. Conveniently, if civilization survives \(n - 1\) major nuclear crises like those described above and is extinguished on the \(n\)th, then the total time to extinction is the sum of \(n\) exponentially distributed random variables, all assumed to have the same mean lifetime \(\mu\), so the total time to extinction follows a gamma distribution with shape \(n\) and mean \(n\mu\).
However, in this case, \(n\) is a random variable, so we'll write it as \(N\), because random variables are typically denoted using capital letters. This further means that the total time to the extinction of civilization follows a mixture of gamma distributions with the same scale parameter, \(\mu\). If each crisis independently escalates to Armageddon with probability \(P\), then \(N\) follows a geometric distribution with mean \(1/P\), and, conditional on \(P\) and \(\mu\), the expected time to extinction of civilization starting now is \(E[N]\,\mu = \mu/P\).
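The claim that a geometric number of exponential inter-crisis times has expected total \(\mu/P\) can be checked by simulation; a hypothetical sketch using \(\mu = 57\) years and \(P = 0.45\), the representative values from the text:

```python
import random

random.seed(2)

MU, P = 57.0, 0.45   # mean time between crises; per-crisis escalation probability
N_SIM = 200_000

def total_time():
    # Accumulate exponential(MU) inter-crisis times until one crisis escalates.
    t = 0.0
    while True:
        t += random.expovariate(1.0 / MU)   # time to the next crisis
        if random.random() < P:             # that crisis escalates
            return t

times = [total_time() for _ in range(N_SIM)]
sample_mean = sum(times) / N_SIM
print(f"sample mean ~ {sample_mean:.1f}; theory says MU/P = {MU / P:.1f}")
```

The sample mean should land close to 57 / 0.45 ≈ 126.7 years, matching the formula \(E[N]\,\mu = \mu/P\).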
Let \(P\) be the probability that a major crisis such as the two considered here escalates to a major nuclear war and the extinction of civilization, with a typical value of 45 percent and bounds estimated above at 30 and 60 percent. In the companion vignette in the Ecfun package for R, mentioned above, we find that a beta distribution with parameters 7.94 and 9.76 has probability 0.8 that \(P\) will lie between 30 and 60 percent, with a mean value of 0.449 and probability 0.1 in each of the lower and upper tails.
This beta distribution is used with the inverse-gamma distribution for \(\mu\) described in Appendix 1 above to generate the Monte Carlo simulation of 10 million simulated times to extinction summarized in the figure above. Details appear in the companion vignette in the Ecfun package for R, mentioned above.
- Nuclear weapons and effective defense
- Effective defense
- Effective defense and ISIL
- Great American Paradox
- Ellsberg, Daniel (2017), The Doomsday Machine: Confessions of a Nuclear War Planner, Bloomsbury USA, ISBN 9781608196708, https://www.bloomsbury.com/us/the-doomsday-machine-9781608196708/
- Ellsberg, Daniel; Goodman, Amy; González, Juan (2017-12-06), Daniel Ellsberg Reveals He Was a Nuclear War Planner, Warns of Nuclear Winter & Global Starvation, Democracy Now, https://www.democracynow.org/2017/12/6/doomsday_machine_daniel_ellsberg_reveals_he.
- McNamara, Robert S.; Blight, James G. (2003), Wilson's Ghost: Reducing the Risk of Conflict, Killing, and Catastrophe in the 21st Century, PublicAffairs, ISBN 1-58648-143-6
- McNamara and Blight (2003)
- Ellsberg (2017)
- Ellsberg, Goodman and González (2017)
- Sagan, Scott D. (1993). The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton University Press. p. 279. ISBN 0-691-02101-5.
- Any such alternative list would need to be screened to eliminate any so-called “Broken Arrow” event, which by definition is “an accidental event that involves nuclear weapons, warheads or components that does not create a risk of nuclear war.” Of course, any particular event that some experts believe might create a risk of war leading to a nuclear Armageddon could be included, with the estimated probability of such escalation set appropriately low to reflect the range of expert opinion about that event.
- McNamara and Blight (2003, pp. 189-190).
- Ellsberg (2017, p. 206).
- McNamara and Blight (2003, pp. 254-255).
- "War Games". Dateline NBC. Burrelle's Information Services. 12 November 2000.
- "Stanislav Petrov: The man who may have saved the world", BBC News Europe, 26 September 2013. Retrieved 26 September 2013.
- Shane, Scott (31 August 2003). "Cold War's Riskiest Moment". Baltimore Sun. Archived from the original on 19 August 2006. Retrieved 20 August 2006. (Article reprinted as "The Nuclear War That Almost Happened in 1983".)
- Ellsberg (2017). Ellsberg, Goodman, and González (2017).
- For details of this, see the companion vignette in the Ecfun package for R. You can get this by first installing the free, open-source R (programming language) software. Then in R, use 'install.packages("Ecfun", repos="http://R-Forge.R-project.org")' to get the package containing this vignette. Then use 'help(package="Ecfun")' to see the package contents, and click “User guides, package vignettes and other documentation” to see the list of vignettes. Then click “R code” and “HTML” next to “Ecfun::nuclearArmageddon, Time to nuclear Armageddon” to get the code and an HTML version of the vignette. You can copy the “R code” into a new R file and execute the code line by line as you read the HTML version, replicating what you read and testing alternatives to evaluate the impact of different approaches to this simulation. This vignette was added in version 0.1-9 of Ecfun, available from R-Forge. If you simply use 'install.packages("Ecfun")', you may get version 0.1-7, which does not have this vignette.
- A nuclear Armageddon like this is only one of four “existential risks” facing humanity today considered by the Future of Life Institute (FLI). The other three are the rapid increase in artificial intelligence (AI), biotech, and climate chaos. Regarding AI, for example, they worry that if computers ever obtain the capability to design and create their next generation, they may ultimately decide that humans pose a threat to their future and eliminate that risk. In addition to eliminating the world's large nuclear arsenals, we must also act aggressively to control other risks like these as well if we are to avoid succumbing to this answer to "Fermi's paradox."
- Kahneman, Daniel (25 October 2011), Thinking, Fast and Slow, Farrar, Straus and Giroux, ISBN 978-0-374-27563-1, Wikidata Q983718
- Regarding Haiti, then called Saint-Domingue, see w:George Washington and slavery, Foreign interventions by the United States, Timeline of United States military operations, and United States involvement in regime change.
- Ellsberg (2017); see also Ellsberg, Goodman, and González (2017).
- The Wikiversity article on the Great American Paradox claims that the US Postal Service Act made a much bigger contribution to making the US what it is today than the violence of the American Revolution.