Hi everyone!
Managers of the EA Infrastructure Fund will be available for an Ask Me Anything session. We'll start answering questions on Friday, June 4th, though some of us will only be able to answer questions the week after. If you would like to make sure that all fund managers can consider your question, you may want to post it before early Friday morning (UK time).
What is the EA Infrastructure Fund?
The EAIF is one of the four EA Funds. While the other three Funds support direct work on various causes, this Fund supports work that could multiply the impact of direct work, including projects that provide intellectual infrastructure for the effective altruism community, run events, disseminate information, or fundraise for effective charities.
Who are the fund managers, and why might you want to ask them questions?
The fund managers are Max Daniel, Michelle Hutchinson, and Buck Shlegeris. In addition, EA Funds Executive Director Jonas Vollmer is temporarily taking on chairperson duties, advising, and voting consultatively on grants. Ben Kuhn was a guest manager in our last grant round. They will all be available for questions, though some may have spotty availability and might post their answers as they have time throughout next week.
One particular reason you might want to ask us questions is that we are all new in these roles: the EAIF's fund managers have all changed recently, and this was our first grant round.
What happened in our most recent grant round?
We have made 26 grants totalling about $1.2 million. They include:
- Two grants totalling $139,200 to Emma Abele, James Aung, Bella Forristal, and Henry Sleight. They will work together to identify and implement new ways to support EA university groups – e.g., by giving high-quality introductory talks about EA and creating other content for workshops and events. University groups have historically been one of the most important sources of highly engaged EA community members, and we believe there is significant untapped potential for further growth. We are also excited about the team, based significantly on their track record – e.g., James and Bella previously led two of the most successful university groups worldwide.
- $41,868 to Zak Ulhaq to develop and implement workshops aimed at helping highly talented teenagers apply EA concepts and quantitative reasoning to their lives. We are excited about this grant because we generally think that educating pre-university audiences about EA-related ideas and concepts could be highly valuable; e.g., we’re aware of (unpublished) survey data indicating that in a large sample of highly engaged community members who learned about EA in the last few years, about ¼ had first heard of EA when they were 18 or younger. At the same time, this space seems underexplored. Projects that are mindful of the risks involved in engaging younger audiences therefore have a high value of information – if successful, they could pave the way for many more projects of this type. We think that Zak is a good fit for efforts in this space because he has a strong technical background and experience with both teaching and EA community building.
- $5,000 to the Czech Association for Effective Altruism to give away EA-related books to people with strong results in Czech STEM competitions, AI classes, and similar programs. We believe that this is a highly cost-effective way to engage a high-value audience; long-form content allows for deep understanding of important ideas, and surveys typically find that books have helped many people become involved with EA (e.g., in the 2020 EA Survey, more than ⅕ of respondents said a book was important for getting them more involved).
- $248,300 to Rethink Priorities to allow Rethink to take on nine research interns (7 FTE) across various EA causes, plus support for further EA movement strategy research. We have been impressed with Rethink’s demonstrated ability to successfully grow their team while maintaining a constant stream of high-quality outputs, and think this puts them in a good position to provide growth opportunities for junior researchers. They also have a long history of doing empirical research relevant to movement strategy (e.g., the EA survey), and we are excited about their plans to build upon this track record by running additional surveys illuminating how various audiences think of EA and how responsive they are to EA messaging.
For more detail, see our payout report. It covers all grants from this round and provides more detail on our reasoning behind some of them.
The application deadline for our next grant round will be June 13th. After this round is wrapped up, we plan to accept rolling applications.
Ask any questions you like; we'll respond to as many as we can.
My knee-jerk reaction is: if "net negative" means "ex-post counterfactual impact anywhere below zero, including close-to-zero cases", then it's close to 50% of grantees. The important point here is that "impact" means "total impact on the universe as evaluated by some omniscient observer". I think it's much less likely that funded projects are net negative by the lights of their own proxy goals, or by any criterion we could actually evaluate in 20 years (assuming no AGI-powered omniscience or similar by then).
(I still think that the total value of the grantee portfolio would be significantly positive, because I'd expect the absolute values of impact to be systematically higher for positive grants than for negative ones.)
This is just a general view I have. It's not specific to EA Funds or the grants this round; it applies to basically any action. That view is somewhat considered, but I think it's also at least somewhat controversial. I have discussed it a bit but not a lot with others, so I wouldn't be very surprised if someone replied to this comment saying "but this can't be right because of X", and then I'd be like "oh ok, I think you're right, the close-to-50% figure now seems massively off to me".
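To make the asymmetry concrete, here's a toy simulation. The distribution shape and parameters are purely illustrative assumptions on my part (not anything we've estimated for actual grants); the point is just that a right-skewed impact distribution can have about half its mass below zero while still having a clearly positive mean:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative assumption, not an estimate): each grant's
# ex-post impact is a lognormal draw shifted so that its median is zero.
# Roughly half the draws then land below zero, but the downside is
# bounded at -1 while the upside tail is unbounded and much heavier.
n = 100_000
impact = rng.lognormal(mean=0.0, sigma=1.5, size=n) - 1.0

print(f"share of grants with negative ex-post impact: {(impact < 0).mean():.1%}")
print(f"mean impact per grant: {impact.mean():.2f}")  # clearly positive
```

With these made-up parameters, about 50% of grants come out net negative ex post, yet the average grant is worth roughly +2 units, because the positive draws are systematically larger in absolute value.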
--
If "net negative" means "significantly net negative" (though I'm not sure what the interesting bar for "significant" would be), then I'm not sure I have a strong prior. Glancing over the specific grants we made I feel that for very roughly 1/4 of them I have some vague sense that "there is a higher-than-baseline risk for this being significantly net negative". But idk what that higher-than-baseline risk is as absolute probability, and realistically I think all that's going on here is that for about 1/4 of grants I can easily generate some prototypical story for why they'd turn out to be significantly net negative. I don't know how well this is correlated with the actual risk.
(NB I still think that the absolute values for 'significantly net negative' grants will be systematically smaller than for 'significantly net positive' ones. E.g., I'd guess that the 99th-percentile ex-post impact grant much more than offsets the 1st-percentile grant [which I'm fairly confident is significantly net negative].)
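Continuing the same toy model from above (again, the numbers are purely illustrative, not estimates), the percentile comparison can be read off directly:

```python
import numpy as np

rng = np.random.default_rng(0)
# Same illustrative impact distribution as in the sketch above.
impact = rng.lognormal(mean=0.0, sigma=1.5, size=100_000) - 1.0

p1, p99 = np.percentile(impact, [1, 99])
print(f"1st percentile grant:  {p1:.2f}")   # around -0.97: bounded downside
print(f"99th percentile grant: {p99:.2f}")  # around +31.8: much larger in absolute value
```

Under these assumptions, a single 99th-percentile grant offsets the 1st-percentile grant more than thirty times over; that's the shape of the reasoning behind my guess above, though the real-world magnitudes are of course unknown.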