In October 2018, I developed a question series on Metaculus related to extinction-level events, spanning risks from nuclear war, bio-risk, climate change and geo-engineering, Artificial Intelligence, and nanotechnology failure modes. Since then, these questions have accrued over 3,000 predictions (ETA: as of today, the number is around 5,000).
A catastrophe is defined as a reduction in the human population of at least 10% in any period of 5 years or less. (Near) extinction is defined as an event that reduces the human population by at least 10% within 5 years, and by at least 95% within 25 years.
Here's a summary of the results as they stand today (September 24, 2023), ordered by risk of near extinction:
| Global catastrophic risk | Chance of catastrophe by 2100 | Chance of (near) extinction by 2100 |
|---|---|---|
| Artificial Intelligence | 6.16% | 3.39% |
| Other risks | 1.52% | 0.13% |
| Biotechnology or bioengineered pathogens | 1.52% | 0.07% |
| Nuclear war | 2.86% | 0.06% |
| Nanotechnology | 0.02% | 0.01% |
| Climate change or geo-engineering | 0.00% | 0.00% |
| Natural pandemics | 0.62% | N/A |
These predictions are generated by aggregating forecasters' individual predictions, weighted by their track records. Specifically, each prediction is weighted by a function of the forecaster's level of 'skill', where 'skill' is estimated from relative performance on resolved forecasts (typically many hundreds of them).
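Metaculus doesn't publish the exact weighting function, but to make the general idea concrete, here is a minimal sketch of a skill-weighted pool in log-odds space. The function name, the pooling rule, and the sample numbers below are illustrative assumptions, not the actual Metaculus Prediction algorithm.

```python
import numpy as np

def weighted_aggregate(probs, skills):
    """Illustrative skill-weighted pooling of individual forecasts.

    probs  : individual probability forecasts (0-1)
    skills : non-negative skill scores derived from each forecaster's
             track record on resolved questions

    This is only a sketch of the idea (a weighted mean in log-odds
    space); the real Metaculus Prediction algorithm is more involved.
    """
    probs = np.clip(np.asarray(probs, dtype=float), 1e-6, 1 - 1e-6)
    weights = np.asarray(skills, dtype=float)
    weights = weights / weights.sum()            # normalise skill weights
    log_odds = np.log(probs / (1 - probs))       # move to log-odds space
    pooled = np.sum(weights * log_odds)          # skill-weighted mean
    return 1 / (1 + np.exp(-pooled))             # back to a probability

# e.g. three forecasters with different track records
print(weighted_aggregate([0.02, 0.05, 0.10], skills=[3.0, 1.5, 0.5]))
```

The effect is simply that the pooled probability is pulled toward the forecasts of people with stronger track records.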
If we assume that these events are independent, the predictions suggest that there's a ~17% chance of catastrophe, and a ~1.9% chance of (near) extinction by the end of the century. Admittedly, independence is likely to be an inappropriate assumption, since, for example, some catastrophes could exacerbate other global catastrophic risks.[1]
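Spelled out, the combined figure under the independence assumption is just one minus the product of the per-risk "survival" probabilities:

$$P(\text{at least one catastrophe by 2100}) = 1 - \prod_i (1 - p_i),$$

where $p_i$ is the forecast probability that risk $i$ causes a catastrophe by 2100 (and likewise for the (near) extinction figure).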
Interestingly, the predictions indicate that while nuclear war and bioengineered pathogens account for a substantial share of catastrophic risk, an AI failure mode is by far the biggest source of extinction-level risk: it is at least five times more likely to cause near extinction than all the other risks combined (3.39% versus roughly 0.27% for the others in total).
Links to all the questions on which these predictions are based may be found here.
For reference, these were the estimates when I first posted this (19 Jun 2022):
| Global catastrophic risk | Chance of catastrophe by 2100 | Chance of (near) extinction by 2100 |
|---|---|---|
| Artificial Intelligence | 3.06% | 1.56% |
| Other risks | 1.36% | 0.11% |
| Biotechnology or bioengineered pathogens | 2.21% | 0.07% |
| Nuclear war | 1.87% | 0.06% |
| Nanotechnology | 0.17% | 0.06% |
| Climate change or geo-engineering | 0.51% | 0.01% |
| Natural pandemics | 0.51% | N/A |
Hi Benjamin, these are great questions! I work with Metaculus and wanted to add a bit of color here:
To your question about how to see the Metaculus Prediction, that's covered here: https://www.metaculus.com/help/faq/#tachyon-costs
—basically one has to be of a sufficient "level", and then pay out some tachyons (the coin of the realm) to unlock the Metaculus Prediction for that question. That said, in this case, we're happy to share the current MP with you. (I'll message you here in a moment.)
And as to how the MP is calculated, the best resource there was written by one of the founders, and lives in this blog post: https://metaculus.medium.com/a-primer-on-the-metaculus-scoring-rule-eb9a974cd204
To your question about catastrophic risk from an unknown source, the table in the post doesn't include that, since it only sums the percentages from the named catastrophic-risk questions, but you're right that you can get something like it from the question you link to:
That question just refers to a 10% decrease by any means, full stop. The Metaculus Prediction there is lower than the Community Prediction, FYI, but it is indeed above the ~14% you get from summing the other questions. That makes some sense, given that there are other possibilities, however remote, that are not explicitly named. But it's also true that there are different predictors on each question, and the linked-to forecast is not explicitly pitched as "summing the other catastrophes gives you 14%, so this question should produce a forecast of 14+X%, where X is the probability of unnamed catastrophes."
I hope that was useful. Please do reach out if you'd like to continue the conversation.