I wanted to share this update from Good Ventures (Cari and Dustin’s philanthropy), which seems relevant to the EA community.
Tl;dr: “while we generally plan to continue increasing our grantmaking in our existing focus areas via our partner Open Philanthropy, we have decided to exit a handful of sub-causes (amounting to less than 5% of our annual grantmaking), and we are no longer planning to expand into new causes in the near term by default.”
A few follow-ups on this from an Open Phil perspective:
- I want to apologize to directly affected grantees (who've already been notified) for the negative surprise here, and for our part in not better anticipating it.
- While this represents a real update, we remain deeply aligned with Good Ventures (they’re expecting to continue to increase giving via OP over time), and grateful that they’ve been willing to take on so many of the diverse funding opportunities we’ve recommended.
- An example of a new potential focus area that OP staff had been interested in exploring that Good Ventures is not planning to fund is research on the potential moral patienthood of digital minds. If any readers are interested in funding opportunities in that space, please reach out.
- Good Ventures has told us they don’t plan to exit any overall focus areas in the near term. But this update is an important reminder that such a high degree of reliance on one funder (especially on the GCR side) represents a structural risk. I think it’s important to diversify funding in many of the fields Good Ventures currently funds, and that doing so could make the funding base more stable both directly (by diversifying funding sources) and indirectly (by lowering the time and energy costs to Good Ventures from being such a disproportionately large funder).
- Another implication of these changes is that going forward, OP will have a higher bar for recommending grants that could draw on limited Good Ventures bandwidth, and so our program staff will face more constraints in terms of what they’re able to fund. We always knew we weren’t funding every worthy thing out there, but that will be even more true going forward. Accordingly, we expect marginal opportunities for other funders to look stronger going forward.
- Historically, OP has been focused on finding enough outstanding giving opportunities to hit Good Ventures’ spending targets, with a long-term vision that once we had hit those targets, we’d expand our work to support other donors seeking to maximize their impact. We’d already gotten a lot closer to GV’s spending targets over the last couple of years, but this update has accelerated our timeline for investing more in partnerships and advising other philanthropists. If you’re interested, please consider applying or referring candidates to lead our new partnerships function. And if you happen to be a philanthropist looking for advice on how to invest >$1M/year in new cause areas, please get in touch.
Thanks, I am glad that you are willing to do this, and am somewhat relieved that your perspective seems more amenable to other people participating on their own terms than I had worried it would be. I am still concerned: I think it's unlikely that other donors would be able to build the appropriate level of trust with OP for this to work (and I am also more broadly quite worried about chilling effects of many kinds here), but I do genuinely believe you are trying to mitigate those damages and see many of the same costs that I see.
Yeah, I think that's life. Acts of omission are real acts, and while the way people are judged for them is different, and the way people try to react to them generally tends to be more fraught, there are of course many acts of omission that are worthy of judgement, and many that are worthy of praise.
I don't think reality splits neatly into "things you are responsible for" and "things you are not responsible for", and I think we have a deeper disagreement here about what this means for building platforms, communities, and societal institutions. If working correctly, these practically always platform people their creators strongly disagree with or find despicable. (I have found many people on LW despicable in many different ways, but my job as a discourse-platform provider is to set things up so that other people can form their own beliefs about that, not to impose my own beliefs on the community I am supporting.)
Neglecting those kinds of platforms, or pouring resources into activities that will indirectly destroy them (like pouring millions of dollars into continued EA and AI Safety growth without supporting the institutions that actually allow those communities and movements to have sane discourse, to coordinate with each other, or to learn about important considerations), is an act with real consequences. Those consequences should of course be taken into account when thinking about how to relate to the people responsible for that funding, and for the corresponding lack of funding of its complement.
A manager at a company who overhires can easily tank the company if they don't get involved in setting up the right culture and onboarding processes, and they are absolutely to blame if the new hires destroy the company culture or its internal processes.
But again, my guess is we have deep, important and also (to me) interesting disagreements here, which I don't want you to feel pressured to hash out here. This isn't the kind of stuff I think one should aim to resolve in a single comment thread, and maybe not ever, but I have thought about this topic a lot and it seemed appropriate to share.