The Long-Term Future Fund (LTFF) website states that it mainly focuses on risks from advanced AI but welcomes applications for other global catastrophic risks (GCR) such as pandemics or nuclear conflict.
However, I've begun to question whether this description accurately reflects their funding priorities. When I examine the grants LTFF awards, they are overwhelmingly focused on AI, with some biosecurity grants and community infrastructure projects alongside. What I haven't been able to find in recent years is any funded project addressing a GCR other than AI or biosecurity.
Since I have applied for such projects and know several others who have done the same, this absence of funded projects doesn't appear to stem from a lack of applications.
This pattern raises the question of whether there exist any non-AI/biosecurity GCR projects that LTFF would actually fund. I would appreciate it if someone with more insight into LTFF's priorities could either give an example of what such a project would need to look like to secure funding, or if LTFF could clarify whether they currently plan to fund non-AI/biosecurity GCR projects at all.
Thanks for the clarification. In that case, I think it would be helpful to state on the website that LTFF won't be funding non-AI/biosecurity GCR work for the foreseeable future. Otherwise you will attract applications you would not fund anyway, which creates unnecessary effort for both applicants and reviewers.