
The Long-Term Future Fund (LTFF) website states that it mainly focuses on risks from advanced AI, but welcomes applications addressing other global catastrophic risks (GCRs) such as pandemics or nuclear conflict.

However, I've begun to question whether this description accurately reflects their funding priorities. When I examine the grants LTFF awards, they are overwhelmingly focused on AI, alongside some biosecurity grants and community infrastructure projects. What I have not been able to find in recent years is any funded project addressing a GCR other than AI or biosecurity.

Since I have applied for funding for such projects and know several others who have done the same, this absence of funded projects does not appear to stem from a lack of applications.

This pattern raises the question of whether any non-AI/biosecurity GCR projects exist that LTFF would actually fund. I would appreciate it if someone with more insight into LTFF's priorities could give an example of what such a project would need to look like to secure funding, or if LTFF could clarify whether it currently plans to fund non-AI/biosecurity GCR projects at all.

calebp

We evaluate grants in other longtermist areas, but you're correct that it's rare for us to fund things that aren't AI or bio (and biosecurity grants have been relatively rare recently). We occasionally fund work in forecasting, macrostrategy, and fieldbuilding.

It's possible that we'll support a broader array of causes in the future, but until we make an announcement, I think the status quo of investigating a range of areas in longtermism and then funding the things that seem most promising to us (as represented by our public reporting) will persist.

Thanks for the clarification. In that case, I think it would be helpful to state on the website that the LTFF won't be funding non-AI/biosecurity GCR work for the foreseeable future. Otherwise you will just attract applications you would not fund anyway, which creates unnecessary effort for both applicants and reviewers.

calebp
We do fund a small amount of non-AI/bio work, so it seems bad to rule those areas out. It could be worth bringing more attention to the breakdown of our public grants if the application distribution is very different from the funded one; I'll check internally next week to see whether that's the case.
FJehn
I meant specifically mentioning that you don't really fund global catastrophic risk work on climate change, ecological collapse, near-Earth objects (e.g., asteroids, comets), nuclear weapons, or supervolcanic eruptions. To my knowledge, such work has not been funded for several years now (please correct me if this is wrong), and since you mentioned that the status quo will continue, I don't see a reason to expect the LTFF to start funding such work in the foreseeable future. Thanks for offering to check whether there is a difference between the public grants and the application distribution; I would be curious to hear the results.