In light of recent discourse on EA adjacency, this seems like a good time to publicly note that I still identify as an effective altruist, not EA adjacent.
I am extremely against defrauding people out of billions of dollars, and FTX was a good reminder of the importance of "don't do evil things for galaxy-brained altruistic reasons". But this has nothing to do with whether I endorse the philosophy that "it is correct to try to think about the most effective and leveraged ways to do good and then actually act on them". And there are many people in or influenced by the EA community who I respect and who I think do good and important work.
I used to feel so strongly about effective altruism. But my heart isn't in it anymore.
I still care about the same old stuff I used to care about, like donating what I can to important charities and trying to pick the charities that are the most cost-effective. Or caring about animals and trying to figure out how to do right by them, even though I haven't been able to sustain a vegan diet for more than a short time. And so on.
But there isn't a community or a movement anymore where I want to talk about these sorts of things with people. That community and movement existed, at least in my local area and at least to a limited extent in some online spaces, from about 2015 to 2017 or 2018.
These are the reasons my feelings about the effective altruist community/movement have changed, especially over the last year or two:
- The AGI thing has gotten completely out of hand. I wrote a brief post here about why I strongly disagree with near-term AGI predictions, and a long comment here about how AGI's takeover of effective altruism has left me disappointed, disturbed, and alienated. 80,000 Hours and Will MacAskill have both pivoted to focusing exclusively or almost exclusively on AGI. AGI talk has dominated the EA Forum for a while. It feels like AGI is what the movement is mostly about now, so I now disagree with most of what effective altruism is about.
- The extent to which LessWrong culture has taken over or "colonized" effective altruism culture is such a bummer. I know there's been at least a bit of overlap for a long time, but ten years ago it felt like effective altruism had its own, unique culture; nowadays it feels like LessWrong culture has almost completely taken over. I have never felt good about LessWrong or "rationalism", and the more knowledge and experience of it I've gained, the more repugnance, horror, and anger I've come to feel toward that culture and ideology. I hate to see that become what effective altruism is like.
- The stori
I wanted to share some insights from my reflection on my mistakes around attraction/power dynamics — especially something about the shape of the blindspots I had. My hope is that this might help to avert cases of other people causing harm in similar ways.
I don’t know for sure how helpful this will be; and I’m not making a bid for people to read it (I understand if people prefer not to hear more from me on this); but for those who want to look, I’ve put a couple of pages of material here.
I've been thinking a lot about how mass layoffs in tech affect the EA community. I got laid off early last year, and after job searching for 7 months and pivoting to trying to start a tech startup, I'm on a career break trying to recover from burnout and depression.
Many EAs are tech professionals, and I imagine that a lot of us have been impacted by layoffs and/or the decreasing number of job openings that are actually attainable for our skill level. The EA movement depends on a broad base of high earners to sustain high-impact orgs through relatively small donations (on the order of $300-3000)—this improves funding diversity and helps orgs maintain independence from large funders like Open Philanthropy. (For example, Rethink Priorities has repeatedly argued that small donations help them pursue projects "that may not align well with the priorities or constraints of institutional grantmakers.")
It's not clear that all of us will be able to continue sustaining the level of donations we historically have, especially if we're forced out of the job markets that we spent years training and earning degrees for. I think it's incumbent on us to support each other more, so that we can each get back to a place where we can earn to give or otherwise have a high impact again.
I'm going to be leaving 80,000 Hours and joining Charity Entrepreneurship's incubator programme this summer!
The summer 2023 incubator round is focused on biosecurity and scalable global health charities, and I'm really excited to see what the best fit for me is and hopefully launch a new charity. The ideas that the research team have written up look really exciting; I'm trepidatious about the challenge of being a founder but psyched to get started. Watch this space! <3
I've been at 80,000 Hours for the last 3 years. I'm very proud of the 800+ advising calls I did, and I feel very privileged that I got to talk to so many people and try to help them along in their careers!
I've learned so much during my time at 80k. The team there has been wonderful to work with - so thoughtful, committed to working out what the right thing to do is, kind, and fun. I'll for sure be sad to leave them.
There are a few main reasons why I'm leaving now:
1. New career challenge - I want to try out something that stretches my skills beyond what I've done before. I think I could be a good fit for being a founder and running something big and complicated and valuable that wouldn't exist without me - I'd like to give it a try sooner rather than later.
2. Stepping back a bit from EA community building after the recent EA crises - Events in EA over the last few months made me re-evaluate how valuable I think the EA community and EA community building are, as well as re-evaluate my personal relationship with EA. While processing all this, I haven't gone to the last few EAGs and have switched my work away from advising calls for the last few months. I have been somewhat sad that there hasn't been more discussion and change by now, though I have been glad to see more EA leaders share things recently (e.g. this from Ben Todd). I do still believe there are some really important ideas that EA prioritises, but I'm more circumspect about some of the things I think we're not doing as well as we could (
I would like to publicly set a goal not to comment on other people's posts with criticism of some minor side point that doesn't matter. I have a habit of doing that, but I think it's usually more annoying than it is helpful, so I would like to stop. If you see me doing it, feel free to call me out.
(I reserve the right to make substantive criticisms of a post's central arguments)
I’ve been working a few hours per week at the Effective Altruism Infrastructure Fund as a Fund Manager since the summer of this year.
EA’s reputation is at a bit of a low point. I’ve even heard EA described as the ‘boogeyman’ in certain well-meaning circles. So why do I feel inclined to double down on effective altruism rather than move on to other endeavours? Some shower thoughts:
* I generally endorse aiming directly for the thing you actually care about. It seems higher integrity, and usually more efficient. I want to do the most good possible, and this goal already has a name and community attached to it: EA.
* I find the core, underlying principles very compelling. The Centre for Effective Altruism highlights scope sensitivity, impartiality, recognition of tradeoffs, and the Scout Mindset. I endorse all of these!
* Seems to me that EA has a good track record of important insights on otherwise neglected topics. Existential risk, risks of astronomical suffering, AI safety, wild animal suffering; I attribute a lot of success in these nascent fields to the insights of people with a shared commitment to EA principles and goals.
* Of course, there’s been a lot of progress on slightly less neglected cause areas too. The mind boggles at the sheer number of human lives saved and the vast amount of animal suffering reduced by organisations funded by Open Philanthropy, for example.
* I have personally benefited massively from EA in achieving my own goals. Beyond some of the above insights, I attribute many improvements in my productivity and epistemics to discussions and recommendations that arose out of the pursuit of EA.
* In other roles or projects I’m considering, when I think of questions like “who will actually realistically consider acting on this idea I think is great? Giving up their time or money to make this happen?” the most obvious and easiest answer often looks like some subset of the EA community. Obviously there are some echo chamber-y and bias-related reasons tha
Apply now for EA Global: London 2025 happening June 6–8. Applications close on May 18 at 11:59 pm BST (apply here)!
We're excited to be hosting what's shaping up to be our biggest EAG yet at the InterContinental London–The O2. We expect to welcome over 1,500 attendees.
We have some travel funding available. More information can be found on the event page and EA Global FAQ.
If you have any questions, please email us at hello@eaglobal.org!