The EA Infrastructure Fund (EAIF) is running an Ask Us Anything! EAIF grantmakers have set aside time to answer questions on the Forum. I (Tom) will aim to answer most questions next weekend (~January 20th), so please submit questions by the 19th.
Please note: We believe the next three weeks are an especially good time to donate to EAIF, because:
- We continue to face significant funding constraints, leading to many great projects going either unfunded or underfunded
- Your donation will be matched at a 2:1 ratio until Feb 2. EAIF has ~$2m remaining in available matching funds, meaning that (unlike LTFF) this match is unlikely to be utilised without your support
If you agree, you can donate to us here.
About the Fund
The EA Infrastructure Fund aims to increase the impact of projects that use the principles of effective altruism, by increasing their access to talent, capital, and knowledge.
Over 2022 and H1 2023, we made 347 grants totalling $13.4m in disbursements. You can see our public grants database here.
Related posts
- EA Infrastructure Fund's Plan to Focus on Principles-First EA
- LTFF and EAIF are unusually funding-constrained right now
- EA Funds organizational update: Open Philanthropy matching and distancing
- EA Infrastructure Fund: June 2023 grant recommendations
- What do Marginal Grants at EAIF Look Like? Funding Priorities and Grantmaking Thresholds at the EA Infrastructure Fund
About the Team
- Tom Barnes: Tom is currently a Guest Fund Manager at EA Infrastructure Fund (previously an Assistant Fund Manager since ~Oct 2022). He also works as an Applied Researcher at Founders Pledge, currently on secondment to the UK Government to work on AI policy. Previously, he was a visiting fellow at Rethink Priorities, and was involved in EA uni group organizing.
- Caleb Parikh: Caleb is the project lead of EA Funds. Caleb has previously worked on global priorities research as a research assistant at GPI, EA community building (as a contractor to the community health team at CEA), and global health policy. Caleb currently leads EAIF as interim chair.
- Linchuan Zhang: Linchuan (Linch) Zhang currently works full-time at EA Funds. He was previously a Senior Researcher at Rethink Priorities working on existential security research. Before joining RP, he worked on time-sensitive forecasting projects around COVID-19. Previously, he programmed for Impossible Foods and Google and has led several EA local groups.
Ask Us Anything
We’re happy to answer any questions – marginal uses of money, how we approach grants, questions/critiques/concerns you have in general, what reservations you have as a potential donor or applicant, etc.
There’s no hard deadline for questions, but I would recommend submitting by the 19th of January, as I aim to respond from the 20th.
As a reminder, we remain funding-constrained, and your donation will be matched (for every $1 you donate, EAIF will receive $3). Please consider donating!
If you have projects relevant to building up the EA community's infrastructure, you can also apply for funding here.
(note that I'm not speaking about CEEALAR or any other EAIF applicants/grantees specifically)
I understand that CEEALAR has created a low-cost hotel/coworking space in the UK for relatively junior people to stay while they work on research projects relevant to GCRs. I think you had some strategic updates recently, so some of my impression of your work may be out of date. Supporting people early on in their impact-focused careers seems really valuable; I've seen lots of people go through in-person retreats and quickly start doing valuable work.
At the same time, I think projects that take lots of junior people and put them in the same physical space for an extended period whilst asking them to work on important and thorny questions have various risks (e.g. negative effects on mental health, attracting negative press to EA, trapping people in suboptimal learning environments).
Some features I'd be excited to see in projects in this reference class (though this is NOT a list of requirements):
* located in an existing hub so that program participants have plenty of people outside the program to interact with
* generally taking people with good counterfactual options outside of EA areas so that people don't feel "trapped" and because this is correlated with being able to do very useful stuff within EA cause areas quickly
* trying to foster an excellent intellectual environment - ideally, there would be a critical mass of thoughtful people and truth-seeking epistemic norms
* having a good track record, with a high proportion of people leaving the program and entering high-impact roles
* taking community health seriously: incidents should be handled in a professional manner, and projects should generally adhere to sensible best practices (e.g. amongst full-time staff, there shouldn't be romantic relationships between managers and their direct reports)
I recently spent some time in the Meridian Office, a co-working space in Cambridge, UK for people working on pressing problems, which seems to be doing a good job on all of the points above (though I haven't evaluated them properly).
(Note that I don't mean to imply that CEEALAR is or isn't doing well on the above points, as I don't want to talk about specific EAIF grantees.)