Urgent News Flash: Humanity Is Worried About the Wrong Risks!

It's been a while since I last posted on Debunking Christianity, I know! Readers may recall that my central project right now (since my book came out) is trying to initiate a desperately needed and extremely important conversation between the secular movement and the existential risk community. In sum, the former is far, far more important than it even realizes because of the latter, and the latter is failing in its effort to keep the world safe because it ignores the target of the former, namely religion. In a forthcoming "Technical Report" from my fledgling organization, the X-Risks Institute, I try a new strategy for getting existential riskologists and new atheists to talk about the future of humanity. But readers will have to wait another week for more details!

For the nonce, I thought it might be worth sharing an article I recently wrote for the Future of Life Institute, where I am a contributing writer. (Their Advisory Board includes people like George Church, Stephen Hawking, and Morgan Freeman. And they recently received an Everest-sized mountain of money from Elon Musk.) The article shows that even on the most conservative estimates offered by existential risk experts -- some of whom are genuinely respectable intellectuals -- the probability of human annihilation is disturbingly high. For example, if Nick Bostrom is right, then the average person's chances of dying in a plane crash are roughly 20 million times lower than his or her chances of encountering an existential catastrophe. In other words, the average person's grave this century is much more likely to say "Died in an existential disaster" than "Died in a plane crash." (Pause for a moment to allow the squishy computer behind your eyes to process that information!) And one of the major potential culprits behind such a catastrophe is religious fanaticism. Thus the link between new atheism and existential risks.

Anyway, without further delay, here's the article. Enjoy a bit of deliciously dark number-crunching!

Via the Future of Life Institute
People tend to worry about the wrong things.
According to a 2015 Gallup Poll, 51% of Americans are “very worried” or “somewhat worried” that a family member will be killed by terrorists. Another Gallup Poll found that 11% of Americans are afraid of “thunder and lightning.” Yet the average person is at least four times more likely to die from a lightning bolt than from a terrorist attack.
Similarly, statistics show that people are more likely to be killed by a meteorite than by a lightning strike. Yet I suspect that most people are less afraid of meteorites than of lightning. In these examples and many others, we tend to fear improbable events while dismissing more significant threats.
One finds a similar reversal of priorities when it comes to the worst-case scenarios for our species: existential risks. These are catastrophes that would either annihilate humanity or permanently compromise our quality of life. While risks of this sort are often described as “high-consequence, improbable events,” a careful look at the numbers by leading experts in the field reveals that they are far more likely than most of the risks people worry about on a daily basis.
Let’s use the probability of dying in a car accident as a point of reference. Dying in a car accident is more probable than any of the risks mentioned above. According to the 2016 Global Challenges Foundation report, “The annual chance of dying in a car accident in the United States is 1 in 9,395.” This means that if the average person lived 80 years, the odds of dying in a car crash would be about 1 in 120. (In percentages, that’s 0.01% per year, or 0.8% over a lifetime.)
Compare this to the probability of human extinction stipulated by the influential “Stern Review on the Economics of Climate Change,” namely 0.1% per year. A human extinction event could be caused by an asteroid impact, a supervolcanic eruption, nuclear war, a global pandemic, or a superintelligence takeover. Although this figure appears small, it compounds over time into something significant. For example, it means that the likelihood of human extinction over the course of a century is 9.5%. It follows that your chances of dying in a human extinction event are nearly 10 times higher than your chances of dying in a car accident.
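For readers who want to check this compounding for themselves, here is a minimal sketch in Python. The constant, independent annual risk and the 80-year lifespan are simplifying assumptions of mine, not figures from the cited reports.

    # Compound a constant annual probability over a number of years,
    # assuming each year's risk is independent (a simplification).
    def cumulative_risk(annual_prob, years):
        return 1 - (1 - annual_prob) ** years

    car_annual = 1 / 9395      # Global Challenges Foundation figure
    extinction_annual = 0.001  # Stern Review's 0.1% per year

    print(f"Car crash over 80 years:   {cumulative_risk(car_annual, 80):.2%}")          # ~0.85%
    print(f"Extinction over 100 years: {cumulative_risk(extinction_annual, 100):.2%}")  # ~9.52%
    # Their ratio comes out around 11 -- the "nearly 10 times" above.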
But how seriously should we take the 9.5% figure? Is it a plausible estimate of human extinction? The Stern Review is explicit that the number isn’t based on empirical considerations; it’s merely a useful assumption. The scholars who have considered the evidence, though, generally offer probability estimates higher than 9.5%. For example, a 2008 survey taken during a Future of Humanity Institute conference put the likelihood of extinction this century at 19%. The philosopher and futurist Nick Bostrom argues that it “would be misguided” to assign a probability of less than 25% to an existential catastrophe before 2100, adding that “the best estimate may be considerably higher.” And in his book Our Final Hour, Sir Martin Rees claims that civilization has a fifty-fifty chance of making it through the present century.
My own view more or less aligns with Rees’, given that future technologies are likely to introduce entirely new existential risks. A discussion of existential risks five decades from now could be dominated by scenarios that are unknowable to contemporary humans, just as nuclear weapons, engineered pandemics, and the possibility of “grey goo” were unknowable to people in the fourteenth century. This suggests that Rees may be underestimating the risk, since his figure is based on an analysis of currently known technologies.
If these estimates are believed, then the average person is 19 times, 25 times, or even 50 times more likely to encounter an existential catastrophe than to perish in a car accident, respectively.
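Those multipliers seem to follow from comparing each estimate with the roughly 1% chance of dying in a car crash over a full century, which the 1-in-9,395 annual figure implies. A quick sketch under the same simplifying assumptions as above:

    # Compare each extinction estimate to the per-century car-crash risk.
    car_century = 1 - (1 - 1 / 9395) ** 100   # ~1.06%, i.e. roughly 1%

    for estimate in (0.19, 0.25, 0.50):       # FHI survey, Bostrom, Rees
        print(f"{estimate:.0%} -> {estimate / car_century:.0f}x the car-crash risk")
    # Prints roughly 18x, 24x, and 47x; the rounder 19/25/50 comparison
    # treats the baseline as exactly 1%.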
These figures vary so much in part because estimating the risks associated with advanced technologies requires subjective judgments about how future technologies will develop. But this doesn’t mean that such judgments must be arbitrary or haphazard: they can still be based on technological trends and patterns of human behavior. In addition, other risks like asteroid impacts and supervolcanic eruptions can be estimated from the relevant historical data. For example, we know that an impactor capable of killing “more than 1.5 billion people” strikes Earth about once every 100,000 years, and that supereruptions happen about once every 50,000 years.
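As a rough illustration of how a recurrence interval translates into a per-century probability, one can model such events as a Poisson process. That modeling choice is mine; the sources only supply the average intervals.

    import math

    # Chance of at least one event in 100 years, given an average
    # recurrence interval, assuming a Poisson process.
    def per_century_prob(recurrence_years):
        return 1 - math.exp(-100 / recurrence_years)

    print(f"Civilization-threatening impactor: {per_century_prob(100_000):.2%}")  # ~0.10%
    print(f"Supereruption:                     {per_century_prob(50_000):.2%}")   # ~0.20%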
Nonetheless, it’s noteworthy that all of the above estimates imply that people should be more worried about existential risks than about any other risk mentioned.
Yet how many people are familiar with the concept of an existential risk? How often do politicians discuss large-scale threats to human survival in their speeches? Some political leaders — including one of the candidates currently running for president — don’t even believe that climate change is real. And there are far more scholarly articles published about dung beetles and Star Trek than existential risks. This is a very worrisome state of affairs. Not only are the consequences of an existential catastrophe irreversible — that is, they would affect everyone living at the time plus all future humans who might otherwise have come into existence — but the probability of one happening is far higher than most people suspect.
Given the maxim that people should always proportion their fears to the best available evidence, the rational person should worry about the above risks in the following order (from least to most risky): terrorism, lightning strikes, meteorites, car crashes, and existential catastrophes. The psychological fact is that our intuitions often fail to track the dangers around us. So, if we want to ensure humanity’s safe passage through the coming decades, we need to worry less about the Islamic State and al-Qaeda, and focus more on the threat of an existential catastrophe.
 
