
Open Philanthropy has spent 828 M 2022-$ on its grantmaking portfolio of global catastrophic risks (GCRs)[1]. However, it has not yet published any detailed quantitative models estimating GCRs, which I believe would be important both to inform efforts to mitigate them and for cause prioritisation. I am thinking about models like Tom Davidson's, which estimates AI takeoff speeds, but outputting the probability of a given annual loss of population or drop in real gross domestic product.

  1. ^

    According to Open Philanthropy's grants database on 17 February 2024, accounting for the focus areas of "Biosecurity & Pandemic Preparedness", "Forecasting", "Global Catastrophic Risks", "Global Catastrophic Risks Capacity Building", and "Potential Risks from Advanced AI".
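To illustrate the kind of output I have in mind, here is a minimal sketch of a tail-risk model. All numbers (the 1 M deaths threshold, the exceedance probability, and the tail index) are hypothetical placeholders, not estimates; the Pareto (power-law) tail is just one common assumption for catastrophe severity.

```python
# Hypothetical sketch: annual probability of losing at least a given
# fraction of the population, assuming a Pareto (power-law) tail for
# annual catastrophe deaths. All parameters below are made up.
threshold = 1e6            # fitting threshold: 1 M deaths in a year
p_exceed_threshold = 0.01  # assumed annual chance of >= 1 M deaths
alpha = 0.5                # assumed tail index

def annual_prob_of_at_least(deaths: float) -> float:
    """P(annual deaths >= x) under the assumed Pareto tail."""
    if deaths <= threshold:
        return p_exceed_threshold
    return p_exceed_threshold * (deaths / threshold) ** (-alpha)

world_population = 8e9
for fraction in (0.001, 0.01, 0.1):
    deaths = fraction * world_population
    print(f"{fraction:.1%} of population: "
          f"{annual_prob_of_at_least(deaths):.2e} per year")
```

A detailed model would derive the parameters from data and mechanistic reasoning rather than guessing them, but even this skeleton shows the target output: a probability per unit severity per year.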





What about "Is Power-Seeking AI an Existential Risk?"?

I don't know if you'd count it as quantitative, but it is detailed.

Thanks for the comment, Ryan. I agree that report by Joseph Carlsmith is quite detailed. However, I do not think it is sufficiently quantitative. In particular, the probabilities which are multiplied to obtain the chance of an existential catastrophe are directly guessed, as opposed to resulting from detailed modelling (in contrast to the AI takeoff speeds calculated in Tom's report). Joseph was mostly aiming to qualitatively describe the arguments, as opposed to quantifying the risk:

My main hope, though, is not to push for a specific number, but rather to lay out the arguments in a way that can facilitate productive debate.

I think it's quite possible that OP has built quantitative models which estimate GCR, but that they haven't published them (e.g. they use them internally).

Hi Saul,

I assume Open Philanthropy (OP) has built quantitative models which estimate GCR, but probably just simple ones, as I would expect a model like Tom's to be published. There may be concerns about information hazards in the context of bio risk, but OP had an approach to quantify that risk while mitigating such hazards:

A second, less risky approach is to abstract away most biological details and instead consider general ‘base rates’. The aim is to estimate the likelihood of a biological attack or accident using historical data and base rates of analogous scenarios, and of risk factors such as warfare or terrorism.
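The base-rates approach described above can be sketched very simply. The example below uses Laplace's rule of succession, a standard way to turn a historical event count into a probability without assigning zero chance to events that have never occurred; the event count and observation window are made-up placeholders, not claims about the historical record.

```python
# Hypothetical sketch of the 'base rates' approach: estimate the annual
# probability of a qualifying biological attack or accident from a count
# of analogous historical events, using Laplace's rule of succession so
# that zero observed events does not imply zero probability.

def laplace_annual_rate(events: int, years: int) -> float:
    """(k + 1) / (n + 2): posterior mean of an annual event probability
    under a uniform prior, given k event-years out of n observed years."""
    return (events + 1) / (years + 2)

# Illustrative (made-up) inputs: 0 qualifying events in 79 observed years.
p = laplace_annual_rate(0, 79)
print(f"Estimated annual probability: {p:.4f}")
```

This abstracts away all biological detail, which is what makes it less risky from an information-hazard perspective, at the cost of ignoring how the threat landscape may differ from the historical record.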
