Time to extinction of civilization

From Wikiversity
This essay is on Wikiversity to encourage a wide discussion of the issues it raises, moderated by the Wikimedia rules that invite contributors to “be bold but not reckless,” contributing revisions written from a neutral point of view and citing credible sources -- and raising other questions and concerns on the associated “Discuss” page.
Leaders in the 1962 Cuban Missile Crisis
Khrushchev, Soviet head of state, 1953-1964 Kennedy, US President, 1961-1963
Castro, Cuban head of state, 1959-2008 McNamara, US Secretary of Defense, 1961-1968

Robert McNamara[1] and Daniel Ellsberg[2] both stated that as long as large nuclear arsenals exist, it is only a matter of time before a nuclear incident generates runaway demands for vengeance, which will likely end only after nearly all the nuclear stockpiles are exhausted. McNamara was Secretary of Defense during the 1962 Cuban Missile Crisis, and Ellsberg was an expert in nuclear war strategy advising McNamara at that time. Both have researched and commented on this since then.

This article presents a methodology for estimating the probability distribution of the time to extinction of civilization.

In brief, this analysis estimates a probability of roughly 11 percent that sometime in the next 40 years, within the expected lifetimes of roughly half the people alive today, a nuclear war will occur that will lead to the deaths of 98 percent of humanity and destroy civilization in the process -- a nuclear Armageddon. Similarly, the probability of such an event in the next 60 years (before the end of the twenty-first century as this is being written) is roughly 19 percent.
This risk will be essentially eliminated if the world's large nuclear arsenals are destroyed first.

These estimates are obtained by straightforward application of standard methods for the analysis of censored time-between-events data. The data consist of one observed time of 21 years between the 1962 Cuban Missile Crisis and the 1983 Soviet nuclear false alarm incident, plus the time to the next such event, which is longer than (censored at) the 36 years between 1983 and 2019 as this is being written. The analysis further assumes a probability averaging 45 percent that a comparable incident in the future would produce a nuclear Armageddon.

These estimates may be conservative, both because they do not consider other known incidents that could have triggered such a nuclear war and winter and because it is extremely difficult to accurately estimate the probability that a future nuclear incident might produce a nuclear war and winter. In fact, Beard and Holt (2019) at the Centre for the Study of Existential Risk at the University of Cambridge estimated a 0.6 percent chance of a nuclear war in the next year. As noted in the section on this below, that is roughly equivalent to estimates of 21 and 30 percent of a nuclear war in the next 40 or 60 years, respectively.

There may be other nuclear incidents that should be included in an analysis like this but are not publicly known. If the Soviet Union had not collapsed, the existence of the 1983 Soviet nuclear false alarm incident would likely still be secret, known only to a few in the Soviet government. There may well have been similar incidents in the US, the People's Republic of China, the United Kingdom, France, India, Pakistan, Israel, and/or North Korea that are still secret. If so, the estimates discussed here could be quite conservative.

The uncertainty in these estimates is substantial. However, even considering this uncertainty, this analysis strongly suggests that

the greatest threat to the national security of each of the major nuclear powers and to the future of humanity more generally is the existing nuclear arsenals,

as implied by the publications of McNamara, Ellsberg, and others.[3]

The entire methodology is described herein. Readers are invited to make any modifications they feel appropriate and to report the results with this article, in a companion article linked to this one, or on the companion “Discuss” page, preferably citing sources for improvements on the analysis described here.

Background

Daniel Ellsberg

This article discusses the threat posed by large nuclear arsenals. In this effort, we describe a methodology for estimating the probability distribution of the time to extinction of civilization.

Ellsberg (2017) said he was a nuclear war planner from 1958 to 1971 for US presidents Eisenhower, Kennedy, Johnson, and Nixon. In 1961, early in the administration of John Kennedy, the new president (at Ellsberg's suggestion) asked his military how many people in Russia and China would die in a nuclear war. Answer: 275 million in the initial exchange, with another 50 million dying of radiation poisoning over the next six months. Kennedy then asked (again at Ellsberg's suggestion) about Eastern Europe, then Western Europe, then the US. Answer: In a nuclear war, a third of humanity would die, most in the initial exchange and the rest from radiation poisoning over the next six months.[4]

In 1955 President Eisenhower told the members of his National Security Council that he thought the US should develop a few nuclear-tipped missiles "as a threat, but not 1000 or more ... if the Russians can fire 1000 a day at us and we can fire 1000 a day at them, then he personally would want to take off for the Argentine." A few weeks earlier he had expressed the same view to Soviet Premier Bulganin at the summit meeting in Geneva: “He said that the development of modern weapons was such that a country which used them genuinely risked destroying itself. Since the prevailing winds went east to west and not north to south a major war would destroy the Northern Hemisphere and he had no desire to leave all life and civilization to the Southern Hemisphere,” according to Matthew Evangelista, who continued, "The Soviet leaders made comparable statements about the suicidal consequences of using nuclear weapons."[5]

In 1957 Mao Zedong, the leader of the People's Republic of China, suggested he thought a third could be conservative: “if war breaks out ... a third could be lost. If it is a little higher, it could be half ... but imperialism would be razed to the ground and the whole world would become socialist. After a few years” the world's population would be restored.[6]

That does not consider nuclear winter, which was not seriously discussed until the 1980s. Ellsberg said that a major nuclear war would almost certainly loft so much soot into the stratosphere that it would produce a nuclear winter lasting several years, during which 98 percent of humanity would starve to death if they did not die of something else sooner. The soot would come from massive firestorms whose smoke would reach the stratosphere and remain there for years, because the soot is very light and clouds rarely form at that altitude to rain it out. This would reduce the production of vegetation so much that most animals, including humans, would starve.[7]

Normal or system accidents

In The Limits of Safety, Scott Sagan documented numerous incidents in the post World War II period that have been described as posing a high risk of nuclear war by accident or miscalculation. Sagan claimed that these incidents reflect the virtual impossibility of operating any sufficiently complex system to ultra-high levels of reliability, a phenomenon known as “normal accidents" or "system accidents".[8]

The basic problem was described by a college classmate, who came to school with one hand all bandaged up. He said he had defeated the safety interlock on his Cuisinart.

This silly story seems similar to the miscalculations that led the Soviet Union to put nuclear weapons on Cuba and led the Kennedy administration to depth-charge a Soviet submarine and prepare to invade Cuba in the mistaken belief that they would not be facing nuclear retaliation in doing so, as discussed further below. It also seems similar to what might have happened if the 1983 Soviet nuclear false alarm had been reported as a real attack to the Kremlin, also discussed further below.

Time between nuclear crises

For the purpose of the present analysis, we assume that the world has seen two nuclear crises that almost resulted in a nuclear war like the one just outlined: the Cuban Missile Crisis of 1962 (October 16-28) and the September 26, 1983, Soviet nuclear false alarm incident.

We also assume that the distribution of the time between such incidents is constant over time.

Given the documentation of normal or systems accidents, any other assumption would seem questionable.

Some other list of nuclear incidents could be used. For example, as of 2019-01-29, the Wikipedia article on “List of nuclear close calls” described 10 incidents, being the Cuban Missile Crisis, the 1983 Soviet nuclear false alarm incident, and eight others.[9]

We also assume that all such major incidents are known, which seems foolish: The 1983 Soviet nuclear false alarm incident only became public knowledge several years after the collapse of the Soviet Union. Similar incidents could have occurred in the militaries of the US, the People's Republic of China, France, the United Kingdom, India, and/or Pakistan and could still be state secrets, with no public awareness of a nuclear disaster that almost occurred. A future update to this article may also try to model the time from each such crisis to the publication of information about it.

Standard data analysis of one observed time of 21 years and another time that is longer than (censored at) 36 years produces an estimate of the mean time between such crises of 57 years, as outlined in Appendix 1 below.

Probability that a nuclear crisis would generate a major nuclear war

We want to build this into an estimate of the probability distribution of the time to extinction of human civilization. To do this, we combine the (random) time to the next nuclear crisis with a random variable indicating whether that next nuclear crisis would escalate into the extinction of civilization.

Probability that another event like the Cuban Missile Crisis might lead to Armageddon

To estimate the probability that another event like the Cuban Missile Crisis might lead to a nuclear war, it is useful to consider McNamara's report of his conversation with Fidel Castro in 1992 during a conference on the thirtieth anniversary of the Cuban Missile Crisis. In October 1962 there were 162 Soviet nuclear warheads in Cuba, which the US government did not believe were there. At that 1992 conference, Castro told McNamara, “we started from the assumption that if there was an invasion of Cuba, nuclear war would erupt. We were certain of that. ... We would be forced to pay the price, that we would disappear.”[10]

This comment is indirectly supported by Ellsberg's discussion of how the Cuban Missile Crisis ended: He said that a Soviet commander on Cuba shot down a US U-2 spy plane killing the pilot, and Castro's anti-aircraft batteries were shooting at other US surveillance aircraft flying over the island. Both these actions directly violated explicit orders from Moscow. Robert Kennedy, the President's brother and Attorney General, told the Soviet ambassador, Dobrynin, "You have drawn first blood ... . [T]he president had decided against advice ... not to respond militarily to that attack, but he [Dobrynin] should know that if another plane was shot at, ... we would take out all the SAMs and antiaircraft ... . And that would almost surely be followed by an invasion.”[11] When Soviet premier Khrushchev heard this, he knew that he had lost control of events in and around Cuba. He immediately accepted the best terms he had received from Washington.

Another event during the Cuban Missile Crisis also almost triggered a nuclear exchange that likely would have killed 98 percent of humanity: On 27 October 1962 a Soviet patrol submarine was being depth-charged by US Navy ships on the surface, which were unaware that the Soviet sub carried nuclear-tipped torpedoes. That submarine could not communicate with Moscow at that time. The commander of that sub and his second-in-command both believed that (a) a nuclear war had probably already begun and (b) their standing orders required them to use their nuclear-tipped torpedoes before surrendering. Fortunately, the commander of the flotilla of subs to which that vessel belonged was aboard that particular sub. That higher-ranking officer disagreed and persuaded the sub commander to surface without using the nuclear-tipped torpedoes.

The point of this discussion is to suggest that the world was incredibly lucky that the Cuban Missile Crisis did not end with at least one nuclear weapon being used.

As to whether the use of one nuclear weapon would trigger a nuclear response, it's useful to recall the comments of McNamara on this: “[I]t is highly unlikely that a nuclear attack on New York City or another American city would end the horror. The urge for revenge in kind would be powerful ... [A] December 2002 Washington Post / ABC News poll revealed that 60 percent of Americans favor nuclear retaliation against Iraq if it were to use either chemical or biological weapons against attacking U.S. forces."[12]

For present purposes, we'll use 50 percent as a plausible estimate of the probability that a crisis analogous to the Cuban Missile Crisis would have ended in a major nuclear war. There is, of course, substantial uncertainty in that number.

Probability that another event like the 1983 Soviet nuclear false alarm incident might lead to Armageddon

A similar analysis of the 1983 Soviet nuclear false alarm incident must consider the reports that Ronald Reagan began his term as US President with substantial saber rattling. This included extensive testing of Soviet defenses by sending military aircraft flying directly toward the USSR, then turning around just before entering Soviet air space. Bruce Blair, an expert on Cold War nuclear strategies and former president of the World Security Institute in Washington, D.C., said, "The Russians (Soviets) saw a US government preparing for a first strike, headed by a President Ronald Reagan capable of ordering a first strike."[13] Other sources concur that Soviet paranoia at that time could have led to a Soviet nuclear first strike, in the mistaken belief that they were only reacting to a first strike by the US already in progress.[14]

In that climate, on September 1, 1983, the Soviets shot down a civilian airliner that had strayed into Soviet airspace, killing all 269 passengers and crew on board. Just over 3 weeks later, on September 26, the equipment in the command center of the Soviet early warning satellites reported the launch of one intercontinental ballistic missile (ICBM) from the US towards the Soviet Union. The system later reported four more ICBMs headed toward the USSR. Lt. Col. Stanislav Petrov, the officer on duty then at this command center, reasoned that a first strike attempt from the US would likely involve hundreds of missiles, not just one or five.

Petrov further noted that he served in that “duty officer” role with others, all of whom were professional soldiers with purely military training. He was the only one with civilian training in addition to his military training. He said if any of the others had been on duty at that time rather than him, they likely would have reported the incident as the launches of actual missiles, with a substantially elevated risk of that incident initiating a major nuclear war.[15]

For the moment, we'll use 40 percent as the probability that a crisis with essentially equivalent characteristics would produce a nuclear war.

Later, we may ask experts about these and perhaps other nuclear close calls to obtain better estimates of the probabilities that essentially equivalent crises in the future would lead to a nuclear war.

Forcing the military to implement McNamara's "Permissive Action Links"

In 2002 Bruce Blair said he had told former U.S. Secretary of Defense Robert McNamara (1961-1968) the previous month that the secret codes (called "Permissive Action Links") required to launch Minuteman missiles had all been set to 00000000. McNamara was shocked, because the top military leaders had assured him that those secret codes had been installed. In fact, the hardware had been installed. However, the secret codes had all been set to 00000000. Blair knew this because one of his jobs while in the U.S. Air Force from 1970 to 1974 had been as a Minuteman ICBM launch control officer. After he left the military, he began lobbying first the Department of Defense and then the U.S. Congress to change those codes to something different. The codes were officially "activated" in 1977. In discussing this, Blair concluded, "It is hard to know where to begin, and end, in recounting stories like this one that reveal how misinformed, misled, and misguided on critical nuclear matters our top leaders have been throughout the nuclear age."[16] For more, see Blair's other publications.[17]

Probability that a major nuclear war might lead to the extinction of civilization

Ellsberg insists that a major nuclear war would almost certainly lead to nuclear winter.[18] In this simple database of only two crises that threatened nuclear war between superpowers with large nuclear arsenals, we will assume for the present that if a nuclear war starts, it will lead to premature deaths of 98 percent of humanity, mostly from the resulting nuclear winter.

It would be useful to include a more substantive review of the literature on nuclear winter, especially if the set of nuclear near misses were expanded to include cases of potential nuclear war between secondary powers that may end without involving countries with larger nuclear arsenals.

The discussion in the previous sections suggested 50 and 40 percent as the probabilities that events substantially equivalent to the 1962 Cuban Missile Crisis and the 1983 Soviet false alarm incident would end in a nuclear war and winter. To test the sensitivity of this analysis, we will use 45 percent as a typical number, with 30 and 60 percent as the ends of an approximate 80 percent confidence interval. Random probabilities like this are typically modeled with a beta distribution; one with parameters 7.94 and 9.76 has 10 percent of the distribution below 0.3, 10 percent above 0.6, and a mean of 0.45. We used that distribution in simulating the probability that each major nuclear incident would lead to a nuclear Armageddon. If you don't like these assumptions, dear reader, please substitute your own.
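As a quick numeric check of these beta parameters, the following Python sketch (the companion analysis itself is written in R) uses scipy to confirm the claimed mean and tail probabilities:

```python
# Check that Beta(7.94, 9.76) matches the description in the text:
# mean 0.45, about 10% of the mass below 0.3 and 10% above 0.6.
from scipy.stats import beta

a, b = 7.94, 9.76
mean = a / (a + b)               # analytic mean of a beta distribution
below_30 = beta.cdf(0.30, a, b)  # probability mass below 0.3
above_60 = beta.sf(0.60, a, b)   # probability mass above 0.6
print(f"mean = {mean:.3f}, P(p < 0.3) = {below_30:.3f}, P(p > 0.6) = {above_60:.3f}")
```

Readers who prefer different tail probabilities can solve for their own beta parameters the same way.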

Normal probability plot of simulated years to nuclear Armageddon. (Right axis is probability. Left axis is normal scores.)

We further assume that the time between major nuclear crises follows an exponential distribution. Standard censored data analysis for one observed and one censored time as discussed here leads to an estimate of 57 years as the mean time between nuclear crises using standard maximum likelihood techniques, as outlined in Appendix 1 below. Uncertainty in estimating the time between such crises was modeled using the Bayesian posterior assuming an improper prior locally uniform in the logarithm of the mean time to the next major nuclear crisis. The accompanying normal probability plot shows the results of 10 million simulated times to nuclear Armageddon based on these assumptions. The estimates of 11 and 19 percent as the probability of a nuclear Armageddon in the next 40 or 60 years, respectively, are marked on this plot. This justifies the numbers given in the introduction above.[19]
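A simplified version of that simulation can be sketched in a few lines of Python (the full analysis is an R vignette). This sketch fixes the mean time between crises at its point estimate of 57 years rather than sampling it from the Bayesian posterior, so its probabilities will come out somewhat different from the 11 and 19 percent quoted above; it relies on the fact that the sum of a geometric number of independent exponential gaps is itself exponentially distributed.

```python
import numpy as np

rng = np.random.default_rng(2019)
n_sim = 100_000
mean_gap = 57.0  # point estimate of mean years between major nuclear crises

# Escalation probability for each simulated future, Beta(7.94, 9.76):
# mean 0.45, roughly 10% below 0.3 and 10% above 0.6, as in the text.
p = rng.beta(7.94, 9.76, size=n_sim)

# If each crisis independently escalates with probability p, the number of
# crises until one escalates is geometric, and the sum of a geometric number
# of Exponential(mean_gap) gaps is Exponential(mean_gap / p).
years = rng.exponential(mean_gap / p)

print("P(within 40 years):", round(np.mean(years < 40), 3))
print("P(within 60 years):", round(np.mean(years < 60), 3))
```

Readers who want the full posterior treatment, including the uncertainty in the 57-year estimate, should consult the Ecfun vignette described in the appendices.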

This analysis could be improved by considering a larger list of potential nuclear crises that might have led to a nuclear Armageddon.

Estimate by the Centre for the Study of Existential Risk

Beard and Holt (2019) at the Centre for the Study of Existential Risk estimated a 0.6 percent chance of a nuclear war within the next year. Assuming an exponential distribution, which seems both fairly standard and reasonable for the purposes of the present article, that translates into a mean time to such a nuclear war of 166.67 years. This further translates into probabilities of 21 percent for a nuclear war in the next 40 years and 30 percent in the next 60 years. This completely independent analysis suggests that the analysis described above is conservative.
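That conversion is straightforward under the exponential assumption; as a minimal sketch:

```python
import math

annual_p = 0.006             # Beard and Holt (2019): 0.6% chance per year
mean_years = 1.0 / annual_p  # roughly 166.67 years between such wars
p40 = 1.0 - math.exp(-40.0 / mean_years)
p60 = 1.0 - math.exp(-60.0 / mean_years)
# prints: mean = 166.67 years; 40-year risk = 21%; 60-year risk = 30%
print(f"mean = {mean_years:.2f} years; 40-year risk = {p40:.0%}; 60-year risk = {p60:.0%}")
```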

Fermi's paradox

These numbers are sufficient to explain Fermi's paradox: The famous physicist Enrico Fermi was convinced that the available evidence suggested that most stars would have a planet that could support life roughly like our earth. Such planets, Fermi thought, would most likely produce life like that on earth within a few million years, including life with sufficient intelligence and physical dexterity to generate radio communications that could be detected on Earth. Yet so far, no such extraterrestrial intelligence has been detected by the Search for Extraterrestrial Intelligence (SETI). This contradiction led Fermi to ask, “Where is everybody?” This has become known as Fermi's paradox.

The present discussion offers one possible answer to Fermi's question: If any extraterrestrial intelligent beings developed nuclear weapons within a few decades of developing radio, as humans have, then it would likely be at most a few thousand years before such a civilization becomes extinct and stops transmitting coherent signals into outer space -- unless, of course, such beings destroyed all substantive nuclear stockpiles before such stockpiles were actually used -- and managed to prevent their subsequent recreation indefinitely.[20]

Other existential risks

This analysis is not intended to imply that a nuclear Armageddon is the only potential existential risk that could destroy civilization. The Future of Life Institute attempts to focus on the most likely and dangerous of these. Their program covers four key areas:

  • AI: Artificial intelligence is improving so rapidly that if humans ever give robots the ability to design and build even more advanced robots, the robots could decide that humans threaten their future and get rid of us.
  • Biotech: Advances in biotech could lead to the design and production of superbugs so virulent that they would kill so many people that it would become impossible to sustain civilization.
  • Nuclear, as discussed here.
  • Climate chaos from global warming.

Of course, these could interact. Climate chaos could lead to a nuclear war and winter. Or advanced robots could use biotech as a means of ridding themselves of the threats posed by unfriendly humans.

This article focuses on the nuclear threat for two reasons:

  1. Both Robert McNamara and Daniel Ellsberg have published detailed analyses insisting that a major nuclear war is inevitable unless the world's large nuclear arsenals are destroyed first. Ellsberg insists further that a nuclear winter will almost certainly follow such a major nuclear war.
  2. This author knows of no comparable documentation suggesting that any other existential threat is as likely to end civilization as a nuclear Armageddon.

How could the situation be this bad with the current status of press freedom in the world?

How could it have been possible to build a doomsday machine like that claimed by Ellsberg (2017) without a more serious public discussion having occurred years ago, at least in the United States? These claims seem to contradict the much vaunted commitment to freedom of the press enshrined in the First Amendment to the United States Constitution. One might naively think that a threat of this magnitude would have long ago generated a substantive public discussion. This is particularly true with the protections to press freedom enforced by the 1964 US Supreme Court decision in New York Times Co. v. Sullivan.

This seeming paradox can be explained in terms of how (a) humans make decisions and (b) the mainstream media are funded and governed.

How humans make decisions

Daniel Kahneman

Daniel Kahneman won the 2002 Nobel Memorial Prize in Economic Sciences for seminal contributions to understanding how humans think and make decisions. Kahneman is a research psychologist, not an economist. He won the economics prize for establishing that the standard economics models of the “rational person” are not how people actually think.

In brief, Kahneman and others have established that people make most decisions intuitively based on what comes most readily to mind. This is called "fast thinking" in Kahneman (2011) Thinking, Fast and Slow. We all have to do this just to get through the day. Kahneman noted that we are capable of more careful thought and deliberation, but we are not very good at using this "slow thinking" when we should.[21]

This often leads us to be overconfident. When overconfidence leads us to believe we are better than a certain adversary, this is called comparative optimism. This plausibly afflicted both sides in the Cuban Missile Crisis: Soviet leaders probably underestimated the likely response of the US, and the US underestimated the military strength of the Soviet forces already in Cuba and of the Soviet submarine depth charged by the US Navy. As noted above, the US did not believe the Soviets already had nuclear weapons on Cuba and did not believe the Soviets had nuclear-tipped torpedoes in the submarine they depth charged, and was wrong on both counts. If there had been one more false move by either side, I likely would not be here to write this article today, and you likely would not be here to read it.

Overconfidence is also present in how easily most people accept the mainstream media's designation of enemies of their country without making an effort to understand (a) why anyone would oppose them, (b) the magnitude of the threat, and (c) the most effective response.

Business model of media organizations

Every media organization in the world sells changes in audience behaviors to the people who control media funding and governance:

  • A media organization without an audience will not have funding for long.
  • A media organization that displeases its funders, the people who give it money, will not likely survive for long.

In the United States, the mainstream commercial broadcasters get 100 percent of their funding from advertising. The Public Broadcasting Service and most print media get most of their funding from advertising.

Advertisers want to increase sales of their goods or services. They also do not want the public aware of any of their own activities that might be questionable.

The US has intervened many times in other countries since President Washington sent US tax money to plantation owners in Haiti to help them suppress a slave rebellion during the French Revolution.[22] In many, and perhaps all, of these interventions, the US government acted to protect US international business interests.

Since World War II, many of these actions have been undertaken in secret. Even when there was public debate in the US media, rarely if ever was enough information provided to allow typical US citizens to understand why any rational person might support their opposition. Media coverage of those perspectives might have offended US international business executives, who controlled advertising budgets.

Meanwhile, in some of these interventions, some journalists and media executives were ignorant of US involvement. In other cases, the media executives and journalists apparently chose to either (a) suppress reports when they could do so with minimal risk of losing audience or (b) spin the stories to support their funders to the extent they could do so while still retaining most of their audience.

This behavior of journalists and media executives is made easier by confirmation bias, which is the natural human tendency to prefer information and sources that reinforce our preconceptions.

Wikipedia, theoretically, offers a solution to media bias, as it remains a donation-funded and ad-free organization.

What can concerned individuals do about this?

If you believe that Daniel Ellsberg may be correct in the analysis he published in his Doomsday Machine,[23] then you may wish to pursue some or all of the following:

  1. Organize or join a study group to review Ellsberg's book and the analysis in this article, exploring the implications and relative plausibility of alternative assumptions and analyses.
  2. Support the International Campaign to Abolish Nuclear Weapons (ICAN). This might include lobbying your public officials to sign and ratify the Treaty on the Prohibition of Nuclear Weapons. In addition to lobbying national figures, ICAN encourages its supporters to lobby lower level public officials as part of their ICAN Cities Appeal.
  3. Support other international efforts to develop a regime of nonnuclear deterrence to the use of nuclear weapons. This could include sanctions that would automatically take effect whenever evidence appears that nuclear weapons may have been used. Such nonnuclear sanctions could make it easier for nuclear states to become nonnuclear, as South Africa did before the end of Apartheid.
  4. If you live in a country with nuclear weapons, especially a large nuclear arsenal, lobby your government to unilaterally destroy its nuclear arsenal and to urge and negotiate with the global community to follow suit. If you live in a country with foreign nuclear weapons on your territory, like Germany to name only one, lobby your government to demand the removal of those nukes and forbid your military from engaging in any activities involving such weapons. The analysis in this article suggests that you are safer without such weapons than with them.
  5. Seek alternative sources for news. You might find perspectives that may give you a deeper understanding of issues and events than what might be available from mainstream sources. Try to remember your sources for different pieces of information, because that can make it easier for you to learn from comparing superficially conflicting sources.
  6. Support citizen-directed subsidies for journalism as provided by the US Postal Service Act of 1792, which arguably made a major contribution to making the US what it is today.[24] I don't want either government bureaucrats or corporate bureaucrats censoring the media I consume.
  7. Support gradually escalating “national security taxes” on all trade with nations possessing nuclear weapons in proportion to the threat they pose to the future of humanity.

Appendix. Relevant statistical theory and computations

These appendices outline some of the math used in these computations. Part of this is implemented in a Google sheet entitled “Time to extinction of civilization”, which interested readers can copy and modify as they wish to test alternative assumptions.

A complete description of the math and the simulation is provided in the companion vignette in the Ecfun package for R (Spencer Graves, 4 February 2020, Ecfun: Functions for Ecdat, Wikidata Q56452538). You can get this by first installing the free, open-source R (programming language) software. Then in R, use 'install.packages("Ecfun")' to get the package. Then use 'help(package="Ecfun")' to see the package contents. From there, click “User guides, package vignettes and other documentation” to see the list of vignettes. Then click “R code” and “HTML” next to “Ecfun::nuclearArmageddon, Time to nuclear Armageddon” to get the code and an HTML version of the vignette. You can copy the “R code” into a new R file so you can execute the code line by line as you read the HTML version, replicating what you read and testing alternatives to evaluate the impact of different approaches to this simulation.

Appendix 1 in the present document describes the standard method for estimating the mean time between events like this. Appendix 2, below, outlines some of the math used to describe the probability distribution of the time to extinction of civilization that results from the assumptions described in this document. More detail, including confidence intervals and simulated times, is provided in the vignette with the Ecfun package for R, just mentioned.

Appendix 1. Estimating the mean time between nuclear crises[edit | edit source]

This appendix documents the estimation of the mean time between nuclear crises. For this we assume an exponential distribution, the simplest lifetime distribution. It assumes that the time to the next crisis is independent of the time since the most recent previous one, which seems reasonable for this application.

For the exponential distribution with mean \(\mu\), the probability that the random time to the next event, \(T\), exceeds \(t\) is \(\Pr(T > t) = e^{-t/\mu}\), and the density is \(f(t) = \mu^{-1} e^{-t/\mu}\). Then the likelihood for observing one lifetime of 21 years and another censored at 36 years is the product of \(f(21)\) and \(\Pr(T > 36)\):

\(L(\mu) = \mu^{-1} e^{-21/\mu} \cdot e^{-36/\mu} = \mu^{-1} e^{-57/\mu}\).

Here, “life” obviously refers to the time to the next incident that poses a major threat of a nuclear war leading to the extinction of civilization. This likelihood is maximized with \(\hat{\mu} = 57\) years. See the vignette in the Ecfun package for R mentioned above for an analysis of the uncertainty in this estimate of \(\mu\).

A spreadsheet including these computations is available as a Google sheet entitled “Time to extinction of civilization”.
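This closed-form computation is easy to replicate. The following sketch in Python (illustrative only; the authoritative implementation is the R vignette in the Ecfun package mentioned above) computes the maximum likelihood estimate of the mean time between crises from one observed and one censored time, and checks that nearby values of the mean have lower log-likelihood:

```python
import math

observed = [21.0]   # years between the 1962 and 1983 crises
censored = [36.0]   # years from 1983 to 2019 with no comparable crisis

T = sum(observed) + sum(censored)   # total exposure time: 57 years
k = len(observed)                   # number of observed events: 1

# MLE for the exponential mean: total exposure time / observed events
mu_hat = T / k   # 57 years

def log_likelihood(mu):
    # observed events contribute the density (1/mu) * exp(-t/mu);
    # censored times contribute the survival function exp(-t/mu)
    return -k * math.log(mu) - T / mu

# sanity check: the log-likelihood at mu_hat beats nearby values
assert log_likelihood(mu_hat) > log_likelihood(50.0)
assert log_likelihood(mu_hat) > log_likelihood(65.0)
print(mu_hat)  # 57.0
```

The same arithmetic underlies the Google sheet: with one observed event in 57 total years of exposure, the estimated mean time between crises is 57 years.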

More generally, for future reference, suppose we have \(n\) observed times, \(t_1, \dots, t_n\), of which the first \(k\) are observed event times and the remaining \(n - k\) are censoring times. Then the likelihood is as follows:

\(L(\mu) = \mu^{-k} e^{-T/\mu}\),

where

\(T = \sum_{i=1}^{n} t_i\).

Then the log(likelihood) is as follows:

\(\ell(\mu) = -k \log(\mu) - T/\mu\).

Or in terms of the rate \(\lambda = 1/\mu\), this is as follows:

\(\ell(\lambda) = k \log(\lambda) - \lambda T\).

Then the score function, being the first derivative of the log(likelihood), is as follows:

\(\partial \ell / \partial \lambda = k/\lambda - T\).

This is monotonically decreasing in \(\lambda\), which means that the likelihood will be maximized where this derivative is 0:

\(\hat{\lambda} = k/T, \quad \hat{\mu} = T/k\).

This gives us the Maximum Likelihood Estimate (MLE) for both \(\lambda\) and \(\mu\). The MLE is typically written with a circumflex (^), so for the current data set of one observed and one censored time, we write \(\hat{\mu} = 57/1 = 57\). Confidence intervals for \(\mu\) are obtained in the companion vignette in the Ecfun package for R, mentioned above. That vignette also derives an inverse-gamma distribution as a Bayesian posterior distribution for \(\mu\), assuming an improper prior locally uniform in \(\log(\mu)\). This posterior is one of the key components of the Monte Carlo simulation of 10 million simulated times to extinction summarized in the figure above.
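Sampling from this posterior is straightforward: if \(X\) follows a gamma distribution with shape \(k\) and scale 1, then \(T/X\) follows an inverse-gamma distribution with shape \(k\) and scale \(T\). The following Python sketch (illustrative; the authoritative version is in the Ecfun vignette) draws from the posterior for one observed event in 57 total years and checks that the posterior mean of the rate \(1/\mu\) is \(k/T\):

```python
import random
import statistics

k, T = 1, 57.0          # one observed event in 57 total years of exposure
rng = random.Random(2019)

# if X ~ Gamma(shape=k, scale=1), then mu = T / X ~ inverse-gamma(k, T)
mus = [T / rng.gammavariate(k, 1.0) for _ in range(100_000)]

# the posterior mean of the rate 1/mu is k / T = 1/57 ~ 0.0175
mean_rate = statistics.fmean(1.0 / m for m in mus)
print(mean_rate)
```

Note that for \(k = 1\) this inverse-gamma posterior has an infinite mean for \(\mu\) itself, which is why summaries are better stated in terms of the rate or of simulated quantiles.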

Appendix 2. Probability distribution of the time to extinction of civilization[edit | edit source]

For the purposes of this analysis, we assume that humans are not able to destroy the world's large nuclear arsenals before they destroy us, as suggested by our explanation of Fermi's Paradox, above. We further assume that the other aspects of this analysis are adequately modeled by the data we consider and our interpretation thereof.

It is known that the exponential distribution is a special case of the gamma distribution with shape parameter \(k = 1\). Conveniently, if civilization survives \(k - 1\) major nuclear crises like those described above and is extinguished on the \(k\)th, and if the times between crises are independent, exponentially distributed random variables all with the same mean lifetime \(\mu\), then the total time to extinction is the sum of \(k\) such random variables and follows a gamma distribution with shape \(k\) and mean \(k\mu\).

However, in this case, \(k\) is a random variable, so we'll write it as \(K\), because random variables are typically denoted using capital letters; if each crisis escalates to extinction with the same probability \(Q\), independently of the others, then \(K\) follows a geometric distribution with \(\mathrm{E}(K) = 1/Q\). This further means that the total time to the extinction of civilization follows a mixture of gamma distributions with the same scale parameter, \(\mu\). And the expected time to extinction of civilization starting now, given \(\mu\) and \(Q\), is \(\mathrm{E}(T \mid \mu, Q) = \mu\,\mathrm{E}(K) = \mu/Q\).
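This compound-geometric structure can be checked by simulation. The sketch below (a Python illustration, not the published R code) holds the mean time between crises fixed at 57 years and the per-crisis escalation probability fixed at 0.45, draws exponential inter-crisis intervals until one crisis escalates, and confirms that the average total time is close to 57/0.45, or about 127 years:

```python
import random

mu, Q = 57.0, 0.45   # mean years between crises; escalation probability
rng = random.Random(1962)

def time_to_extinction():
    # accumulate exponential inter-crisis times until a crisis escalates,
    # which happens with probability Q at each crisis (geometric K)
    t = 0.0
    while True:
        t += rng.expovariate(1.0 / mu)
        if rng.random() < Q:
            return t

n = 200_000
avg = sum(time_to_extinction() for _ in range(n)) / n
print(avg)   # close to mu / Q = 126.7 years
```

This average applies only conditionally on \(\mu\) and \(Q\); the full analysis treats both as random, as described next.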

Let \(Q\) be the probability that a major crisis such as the two considered here escalates to a major nuclear war and the extinction of civilization, with a typical value of 45 percent and bounds as estimated above at 30 and 60 percent. In the companion vignette in the Ecfun package for R, mentioned above, we find that a beta distribution with parameters 7.94 and 9.76 has a probability of 0.8 that \(Q\) will lie between 30 and 60 percent, with a mean value of 0.449 and probability 0.1 in each of the lower and upper tails.
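These properties of the beta(7.94, 9.76) distribution are easy to verify empirically. The following Python sketch (illustrative only) draws a large sample and checks the mean and the central probability mass:

```python
import random
import statistics

alpha, beta = 7.94, 9.76
rng = random.Random(1983)

qs = [rng.betavariate(alpha, beta) for _ in range(200_000)]

mean_q = statistics.fmean(qs)                         # about 0.449
central = sum(0.30 < q < 0.60 for q in qs) / len(qs)  # about 0.80
print(mean_q, central)
```

The exact mean is 7.94/(7.94 + 9.76) = 0.4486, matching the 0.449 quoted above.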

This beta distribution is used with the inverse-gamma distribution for \(\mu\) described in Appendix 1 above to generate the Monte Carlo simulation of 10 million simulated times to extinction summarized in the figure above. Details appear in the companion vignette in the Ecfun package for R, mentioned above.
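The structure of that simulation can be sketched as follows. This Python version (an illustrative sketch of the components described above, using 100,000 draws rather than 10 million for speed; the published figure comes from the Ecfun vignette, which may differ in detail) draws \(\mu\) from the inverse-gamma posterior, \(Q\) from the beta distribution, the number of crises \(K\) from the resulting geometric distribution, and then the total time from a gamma distribution with shape \(K\) and scale \(\mu\):

```python
import random

rng = random.Random(42)
k, T = 1, 57.0       # one observed event in 57 years (Appendix 1)
a, b = 7.94, 9.76    # beta parameters for Q (Appendix 2)

def simulate_once():
    mu = T / rng.gammavariate(k, 1.0)   # inverse-gamma posterior draw for mu
    q = rng.betavariate(a, b)           # escalation-probability draw for Q
    n_crises = 1                        # geometric: crises until escalation
    while rng.random() >= q:
        n_crises += 1
    # total time: gamma(shape=n_crises, scale=mu), the sum of n_crises
    # exponential inter-crisis intervals with mean mu
    return rng.gammavariate(n_crises, mu)

times = sorted(simulate_once() for _ in range(100_000))
median = times[len(times) // 2]
print(median)   # median simulated time to extinction, in years
```

Because the inverse-gamma posterior for \(\mu\) has heavy tails, quantiles such as the median are more informative summaries of these simulated times than the mean.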

See also[edit | edit source]

References[edit | edit source]

  • Robert McNamara; James G. Blight (2003). Wilson's ghost: reducing the risk of conflict, killing, and catastrophe in the 21st century (in en). PublicAffairs. Wikidata Q64736611. ISBN 1-58648-143-6. 

Notes[edit | edit source]

  1. McNamara and Blight (2003)
  2. Ellsberg (2017)
  3. See, e.g., William Perry; Tom Z. Collina (June 2020), The Button: The new nuclear arms race and presidential power from Truman to Trump, BenBella Books, Wikidata Q102046116. See also Ira Helfand (November 2013), Nuclear famine: two billion people at risk? (2nd ed.), Wikidata Q63256454. This could further spell the end of civilization, according to Shaun Tandon (10 December 2013), Une guerre nucléaire provoquerait-elle la fin de la civilisation ? (in French), Agence France-Presse, Wikidata Q106659147. See also Paul Quilès; Jean-Marie Collin; Michel Drain (February 2019), L'illusion nucléaire: La face cachée de la bombe atomique (in French), Éditions Charles Léopold Mayer, Wikidata Q106657529.
  4. Ellsberg, Goodman and González (2017)
  5. Matthew Evangelista (December 1997). "“Why Keep Such an Army?” Khrushchev’s Troop Reductions". Cold War International History Project 19. Wikidata Q86245167. https://www.wilsoncenter.org/sites/default/files/media/documents/publication/ACFB43.pdf. , pp. 38-39. See also Noam Chomsky (May 2016), Who Rules the World?, Picador, OL 20029949W, Wikidata Q86247233, p. 101.
  6. Dikötter, Frank (2010). Mao's Great Famine: The History of China's Most Devastating Catastrophe, 1958–62. London: Walker & Company. p. 13. ISBN 978-0-8027-7768-3, cited from the Wikipedia article on w:Mao Zedong. The Wikipedia article does not cite a year. That is contained in Serge Halimi, “Hier, révolutionnaires et rivaux”, Le Monde Diplomatique, August 2018, p. 8.
  7. Owen B. Toon; Alan Robock; Michael Mills; Lili Xia (May 2017). "Asia Treads the Nuclear Path, Unaware That Self-Assured Destruction Would Result from Nuclear War". The Journal of Asian Studies 76 (2): 437–456. doi:10.1017/S0021911817000080. Wikidata Q86789804. ISSN 0021-9118. http://climate.envsci.rutgers.edu/pdf/ToonAsianStudies.pdf. 
  8. Sagan, Scott D. (1993). The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton U. Pr.. p. 279. ISBN 0-691-02101-5. 
  9. Any such alternative list would need to be screened to eliminate any so-called “Broken Arrow” event, which by definition is “an accidental event that involves nuclear weapons, warheads or components that does not create a risk of nuclear war.” Of course, any particular event that some experts believe might create a risk of war leading to a nuclear Armageddon could be included, with the estimated probability of such escalation set appropriately low to reflect the range of expert opinion about that particular event.
  10. McNamara and Blight (2003, pp. 189-190).
  11. Ellsberg (2017, p. 206).
  12. McNamara and Blight (2003, pp. 254-255).
  13. "War Games". Dateline NBC. Burrelle's Information Services. 12 November 2000.
  14. Scott Shane (31 August 2003). "Cold War's Riskiest Moment". The Baltimore Sun. Wikidata Q61767551. ISSN 1930-8965. Archived from the original on 19 August 2006. http://hnn.us/articles/1709.html. 
  15. "Stanislav Petrov: The man who may have saved the world". BBC News Europe. 26 September 2013. Retrieved 26 September 2013.
  16. Bruce G. Blair (11 February 2004). "Keeping Presidents in the Nuclear Dark (Episode #1: The Case of the Missing Permissive Action Links)". Bruce Blair's Nuclear Column. Wikidata Q111619559. https://www.globalzero.org/wp-content/uploads/2019/03/BB_Keeping-Presidents-in-the-Nuclear-Dark-Episode-1-The-Case-of-the-Missing-Permissive-Action-Links_02.11.2004.pdf. 
  17. A list of Blair's publications is available in Bruce G. Blair, Bruce Blair's Nuclear Column, Wikidata Q111619228
  18. Ellsberg (2017). Ellsberg, Goodman, and González (2017).
  19. For details of this see the companion vignette in the Ecfun package for R. You can get this by first installing the free, open-source R (programming language) software. Then in R, use ' install.packages("Ecfun", repos="http://R-Forge.R-project.org")' to get the package containing this vignette. Then use 'help(pac="Ecfun")' to see the package contents, then click “User guides, package vignettes and other documentation” to see the list of vignettes. Then click “R code” and “HTML” next to “Ecfun::nuclearArmageddon, Time to nuclear Armageddon” to get the code and an HTML version of the vignette. You can copy the “R code” into a new R file you create so you can execute the code line by line as you read the HTML version to replicate what you read and test alternatives to evaluate the impact of different approaches to this simulation. This vignette was added in version 0.1-9 of Ecfun, available from R-Forge. If you use simply 'install.packages(“Ecfun”)' you may get version 0.1-7, which does not have this vignette.
  20. A recent review of scientific discussion suggesting exactly this explanation of Fermi's Paradox is in Elizabeth Kolbert (18 January 2021). "An eminent astrophysicist argues that signs of intelligent extraterrestrial life have appeared in our skies. What’s the evidence for his extraordinary claim?". The New Yorker. Wikidata Q105337191. ISSN 0028-792X. https://www.newyorker.com/magazine/2021/01/25/have-we-already-been-visited-by-aliens. .
  21. Kahneman (2011)
  22. Regarding Haiti, then called Saint-Domingue, see w:George Washington and slavery, Foreign interventions by the United States, Timeline of United States military operations, and United States involvement in regime change.
  23. Ellsberg (2017); see also Ellsberg, Goodman, and González (2017).
  24. The Wikiversity article on the Great American Paradox claims that the US Postal Service Act made a much bigger contribution to making the US what it is today than the violence of the American Revolution.