Existential Risks
Human extinction is 1,000,000 times more likely than people think

“The End of The World” by Jason Windsor, 2003
“The End of the World” animation artistically depicted the feeling of unease, uncertainty and fear resulting from multiple countries having nuclear weapons capabilities.
As I recall, I found the video amusing because of the different accents and its perceived absurdity. It also branded into my brain the possibility of nuclear war and human extinction, a feeling only heightened by the growing geopolitical tensions of modern times.
Twenty years later, I realise that the underlying message, that humanity could drive itself extinct, is not so absurd.
The development of nuclear weapons meant that, for the first time in history, we had a technology powerful enough to destroy ourselves. In the aftermath of the Cuban Missile Crisis, President John F. Kennedy estimated the chances of nuclear war at “between one in three and even”. Numerous times during the crisis, it came down to a single person’s decision not to “press the button”.
The probability of human extinction within the next century has only increased over the past decades, driven in particular by anthropogenic existential risks: risks arising from intentional or accidental human activity.
Each time we invent a powerful new technology, it comes with massive potential benefits but also new risks:
Nuclear fission and fusion can be used as an abundant low-carbon energy source or as a weapon of mass destruction
Biotechnology can be used for vaccines and cures or for bioweapons like engineered pandemics
Artificial intelligence can be used to develop impact technologies or as a tool to do harm
What’s interesting about existential risks is how little attention they receive relative to their importance. After all, survival should be our first priority: it is the precondition for solving all of our other problems.
It seems our collective psyche has not yet adjusted to this “new normal”, even though it is one of the defining characteristics of modern times: the age of existential risk.
80,000 Hours, the London-based non-profit that researches which careers have the largest social impact, ranks careers that reduce existential risks, for example in AI safety, nuclear security and biosecurity, among those with the highest impact potential.
What are Existential Risks?
In a survey, Americans estimated the chances of human extinction within 50 years to be extremely low, with over 30% guessing it to be under 1 in 10 million (1). However, researchers who study these issues believe the risk to be at least 1,000 times higher, and possibly far higher still.
Natural risks like an asteroid impact, supervolcanic eruption or stellar explosion have a very low probability of happening in the next 100 years relative to the anthropogenic risks defined above, which include:
Nuclear war
Climate change
Other environmental damage, such as biodiversity loss
Naturally arising pandemics
Engineered pandemics
Unaligned artificial intelligence
Unforeseen anthropogenic risks (for example from new emerging technologies)
Other anthropogenic risks (ones we can foresee but which are less likely than those listed above)
Combining all of these risks, philosopher Toby Ord estimates that the risk of extinction in the next 100 years is about one million times higher than what people normally think. See the figure below for a breakdown from his book, The Precipice: Existential Risk and the Future of Humanity.

The Precipice: Existential Risk and the Future of Humanity
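As a rough sanity check of the “one million times” claim (my own back-of-the-envelope, assuming the survey’s most common answer of under 1 in 10 million and Ord’s headline estimate of roughly 1 in 6 for an existential catastrophe this century):

$$\frac{P_{\text{Ord}}}{P_{\text{survey}}} \approx \frac{1/6}{1/10^{7}} = \frac{10^{7}}{6} \approx 1.7 \times 10^{6}$$

So the expert figure really is on the order of a million times the typical survey answer.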
This discrepancy between people’s attitudes towards existential risk and the expert estimates is mirrored in the number of people and the amount of resources dedicated to tackling these risks. The resources we invest in reducing existential risks are simply not enough.
The table below shows that, for example, extreme pandemic prevention and AI safety research receive relatively little attention (i.e. are neglected) compared with the probability that the risks they target result in an existential catastrophe.

Source: 80,000 Hours
Academic prioritisation shows a similar disregard for an issue as critical as human extinction. This matters because what academia prioritises shapes what counts as legitimate knowledge. The table below from Nick Bostrom indicates that we are not researching existential risks nearly enough.

Source: Existential Risk Prevention as Global Priority
Focus on safeguarding humanity
Applying the 80,000 Hours problem framework, which compares global problems in terms of scale (how many people are affected by a problem), neglectedness (how many people are already working on it) and solvability (how easy it is to make progress), shows that focusing on existential risks is a top high-impact career option.
We already looked at the scale of the problem and analysed its relative neglectedness…
What about solvability? What can be done about these risks?
It’s far less certain that we can make progress on these risks in the short term, which means that reducing existential risks scores worse on solvability. However, given the huge scale and neglectedness of these risks, they are still very urgent issues.
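To make the framework concrete, here is a toy sketch in Python. This is my own illustration, not 80,000 Hours’ actual model: I assume each factor is scored on a logarithmic scale and the scores are summed (so adding scores corresponds to multiplying the underlying factors), and all of the numbers are invented.

```python
# Toy illustration of a scale/neglectedness/solvability comparison.
# Assumption: each factor is a log-scale score, so summing scores
# corresponds to multiplying the underlying factors.

def problem_score(scale: float, neglectedness: float, solvability: float) -> float:
    """Overall priority score: the sum of the three log-scale factor scores."""
    return scale + neglectedness + solvability

# Purely invented scores, for illustration only.
problems = {
    "AI safety": (15, 8, 4),
    "Extreme pandemic prevention": (14, 7, 5),
    "Climate change": (13, 2, 6),
}

# Rank the problems from highest to lowest overall score.
for name, factors in sorted(problems.items(), key=lambda kv: -problem_score(*kv[1])):
    print(f"{name:30s} score: {problem_score(*factors)}")
```

Note how, on this kind of scoring, a problem can still come out on top despite a low solvability score, which is exactly the situation described above.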
Here are some ways to reduce risks divided into three broad categories:
Targeted efforts to reduce specific risks
One approach is to address each risk directly.
For example, to address pandemics one could work on disease surveillance, better collection and aggregation of data, or improved technology to spot new outbreaks faster.
To mitigate climate change, one could work on developing new technologies, like better solar panels, or on introducing a carbon tax.
To reduce the chance of unintended consequences from AI, one could research the control problem within computer science.
To improve nuclear security, one could work towards far smaller stockpiles, which would reduce both the risk of accidents and the chance that a nuclear war, if it occurred, would end civilisation.
Broad efforts to reduce risks
Rather than trying to reduce risks individually, we can work on making civilisation generally better at managing them, or on eliminating existential risk factors. Broad efforts help reduce all of the threats at once, even those we haven’t thought of yet.
For example, improving the decision-making ability of people in government would help manage risks as they arise and make society generally more resilient and better at solving problems.
Learning more and building capacity
Another key goal is to learn more about all of these issues in order to understand better which risks are the biggest, what to do about them and how to prioritise them effectively.
Global priorities research combines economics and moral philosophy to answer high-level questions about the most important issues for humanity.
Another way to handle uncertainty is to build up resources that can be deployed later. This can mean increasing your own capabilities, for example by gaining career capital, so you can achieve more impact in the future. Or it can mean building a community around these topics, which can grow quickly and offer a potentially high rate of return on impact.
Dive Deeper
Below are resources and communities for exploring existential risks further:
Orgs
Stanford’s Existential Risks Initiative
Oxford’s Future of Humanity Institute (now closed)
There’s a great list of additional resources here!
Reading
The Precipice: Existential Risk and the Future of Humanity by Toby Ord
Existential Risk Prevention as Global Priority by Nick Bostrom
How to use your career to reduce existential risk by 80,000 Hours
The case for reducing existential risk by 80,000 Hours
Listening
Existential Risk: A Conversation with Toby Ord (Episode #208) on the Making Sense podcast by Sam Harris
People
I will be posting frontier stuff about sustainable development and impact every single week. If you are interested in these topics, make sure to subscribe to the newsletter.
Thank you for being here!
x Verneri
Sources used for this Issue:
Todd, B. (2016, April). Existential risks: An introduction. 80,000 Hours. https://80000hours.org/articles/existential-risks/
Bostrom, N. (2011). Existential Risk. Existential Risk Organization. https://web.archive.org/web/20171102063847/http://www.existential-risk.org/concept.html