The Long-Term Future Fund (LTFF) is one of the EA Funds. Between Friday Dec 4th and Monday Dec 7th, we'll be available to answer any questions you have about the fund – we look forward to hearing from all of you!
The LTFF aims to positively influence the long-term trajectory of civilization by making grants that address global catastrophic risks, especially potential risks from advanced artificial intelligence and pandemics. In addition, we seek to promote, implement, and advocate for longtermist ideas, and to otherwise increase the likelihood that future generations will flourish.
Grant recommendations are made by a team of volunteer Fund Managers: Matt Wage, Helen Toner, Oliver Habryka, Adam Gleave and Asya Bergal. We are also fortunate to be advised by Nick Beckstead and Nicole Ross. You can read our bios here. Jonas Vollmer, who is heading EA Funds, also provides occasional advice to the Fund.
You can read about how we choose grants here. Our previous grant decisions and rationale are described in our payout reports. We'd welcome discussion and questions regarding our grant decisions, but to keep discussion in one place, please post comments related to our most recent grant round in this post.
Please ask any questions you like about the fund, including but not limited to:
- Our grant evaluation process.
- Areas we are excited about funding.
- Coordination between donors.
- Our future plans.
- Any uncertainties or complaints you have about the fund. (You can also e-mail us at ealongtermfuture[at]gmail[dot]com for anything that should remain confidential.)
We'd also welcome more free-form discussion, such as:
- What should the goals of the fund be?
- What is the comparative advantage of the fund compared to other donors?
- Why would you/would you not donate to the fund?
- What, if any, goals should the fund have other than making high-impact grants? Examples could include: legibility to donors; holding grantees accountable; setting incentives; identifying and training grant-making talent.
- How would you like the fund to communicate with donors?
We look forward to hearing your questions and ideas!
From an internal perspective I'd view the fund as being fairly close to risk-neutral. We hear roughly twice as many complaints that we're too risk-tolerant as that we're too risk-averse, although of course the people who reach out to us may not be representative of our donors as a whole.
We do explicitly try to be conservative about grants with a chance of significant negative impact, in order to avoid the unilateralist's curse. I'd estimate this affects less than 10% of our grant decisions, although the proportion is higher in some areas, such as community building, biosecurity and policy.
It's worth noting that, unless I see a clear case for a grant, I tend to predict a low expected value -- not just a high-risk opportunity. This is because I think most projects aren't going to positively influence the long-term future -- otherwise the biggest risks to our civilization would already be taken care of. Based on that prior, it takes significant evidence to update me in favour of a grant having substantial positive expected value. This produces similar decisions to risk-aversion with a more optimistic prior.
Unfortunately, it's hard to test this prior: we'd need to see how good the grants we didn't make would have been. I'm not aware of any grants we passed on that turned out to be really good. But I haven't evaluated this systematically, and we'd only know about those that someone else chose to fund.
An important case where donors may be better off making donations themselves rather than donating via us is when they have more information than we do about some promising donation opportunities. In particular, you likely hear disproportionately about grants we rejected from people already in your network. You may be in a much better position to evaluate these than we are, especially if the impact of the grant hinges on the individual's abilities, or requires a lot of context to understand.
It's unfortunate that individual donors can't directly make grants to individuals in a tax-efficient manner. You could consider donating to a donor lottery -- these allow you to donate the same amount of money (in expectation) in a tax-efficient manner. While grants can only be made within CEA's charitable objects, this should cover the majority of things donors would want to support, and in any case the LTFF also faces this restriction. (Jonas also mentioned to me that EA Funds is considering offering Donor-Advised Funds that could grant to individuals as long as there's a clear charitable benefit. If implemented, this would also allow donors to provide tax-deductible support to individuals.)