Building effective altruism
Growing, shaping, or otherwise improving effective altruism as a practical and intellectual project

Quick takes

18 · 15h
As a community builder, I've started donating directly to my local EA group—and I encourage you to consider doing the same. Managing budgets and navigating inflexible grant applications consume valuable time and energy that could otherwise be spent directly fostering impactful community engagement. As someone deeply involved, I possess unique insights into what our group specifically needs, how to effectively meet those needs, and what actions are most conducive to achieving genuine impact. Of course, seeking funding from organizations like OpenPhil remains highly valuable—they've dedicated extensive thought to effective community building. Yet, don't underestimate the power and efficiency of utilizing your intimate knowledge of your group's immediate requirements. Your direct donations can streamline processes, empower quick responses to pressing needs, and ultimately enhance the impact of your local EA community.
46
12d
Here are my rules of thumb for improving communication on the EA Forum and in similar spaces online:

* Say what you mean, as plainly as possible.
* Try to use words and expressions that a general audience would understand.
* Be more casual and less formal if you think that means more people are more likely to understand what you're trying to say.
* To illustrate abstract concepts, give examples.
* Where possible, try to let go of minor details that aren't important to the main point someone is trying to make. Everyone slightly misspeaks (or mis... writes?) all the time. Attempts to correct minor details often turn into time-consuming debates that ultimately have little importance. If you really want to correct a minor detail, do so politely, and acknowledge that you're engaging in nitpicking.
* When you don't understand what someone is trying to say, just say that. (And be polite.)
* Don't engage in passive-aggressiveness or code insults in jargon or formal language. If someone's behaviour is annoying you, tell them it's annoying you. (If you don't want to do that, then you probably shouldn't try to communicate the same idea in a coded or passive-aggressive way, either.)
* If you're using an uncommon word, or using a word that also has a more common definition in an unusual way (such as "truthseeking"), please define that word as you're using it and, if applicable, distinguish it from the more common way the word is used.
* Err on the side of spelling out acronyms, abbreviations, and initialisms. You don't have to spell out "AI" as "artificial intelligence", but an obscure term like "full automation of labour" or "FAOL" that was made up for one paper should definitely be spelled out.
* When referencing specific people or organizations, err on the side of giving a little more context, so that someone who isn't already in the know can more easily understand who or what you're talking about. For example, instead of just saying "MacAskill" or "Will", say "Will MacAskill".
40 · 1mo · 6
I used to feel so strongly about effective altruism. But my heart isn't in it anymore. I still care about the same old stuff I used to care about, like donating what I can to important charities and trying to pick the charities that are the most cost-effective. Or caring about animals and trying to figure out how to do right by them, even though I haven't been able to sustain a vegan diet for more than a short time. And so on. But there isn't a community or a movement anymore where I want to talk about these sorts of things with people. That community and movement existed, at least in my local area and at least to a limited extent in some online spaces, from about 2015 to 2017 or 2018.

These are the reasons for my feelings about the effective altruist community/movement, especially over the last one or two years:

* The AGI thing has gotten completely out of hand. I wrote a brief post here about why I strongly disagree with near-term AGI predictions. I wrote a long comment here about how AGI's takeover of effective altruism has left me disappointed, disturbed, and alienated. 80,000 Hours and Will MacAskill have both pivoted to focusing exclusively or almost exclusively on AGI. AGI talk has dominated the EA Forum for a while. It feels like AGI is what the movement is mostly about now, so now I just disagree with most of what effective altruism is about.
* The extent to which LessWrong culture has taken over or "colonized" effective altruism culture is such a bummer. I know there's been at least a bit of overlap for a long time, but ten years ago it felt like effective altruism had its own, unique culture, and nowadays it feels like the LessWrong culture has almost completely taken over. I have never felt good about LessWrong or "rationalism", and the more knowledge and experience of it I've gained, the more I've accumulated a sense of repugnance, horror, and anger toward that culture and ideology. I hate to see that become what effective altruism is like.
* The stori…
24 · 22d · 1
I've been thinking a lot about how mass layoffs in tech affect the EA community. I got laid off early last year, and after seven months of job searching and a pivot to trying to start a tech startup, I'm now on a career break trying to recover from burnout and depression. Many EAs are tech professionals, and I imagine that a lot of us have been impacted by layoffs and/or the decreasing number of job openings that are actually attainable for our skill level. The EA movement depends on a broad base of high earners to sustain high-impact orgs through relatively small donations (on the order of $300–3,000)—this improves funding diversity and helps orgs maintain independence from large funders like Open Philanthropy. (For example, Rethink Priorities has repeatedly argued that small donations help them pursue projects "that may not align well with the priorities or constraints of institutional grantmakers.") It's not clear that all of us will be able to continue sustaining the level of donations we historically have, especially if we're forced out of the job markets that we spent years training and earning degrees for. I think it's incumbent on us to support each other more, to help each other get back to a place where we can earn to give or otherwise have a high impact again.
7 · 8d
Hey! I'm requesting some help with "Actions for Impact", a Notion page listing activities people can get involved in that take less than 30 minutes and can contribute to EA cause areas. This includes signing petitions, emailing MPs, voting for effective charities in competitions, responding to 'calls for evidence', or sharing something online. EA UK has the Notion page linked on their website: https://www.effectivealtruism.uk/get-involved It should serve as a hub to leverage the size of the EA community when it's needed. I'm excited about the idea, and I thought I'd have enough time to keep it updated and share it with organisations and people, but I really don't. If the idea sounds exciting and you have an hour or two spare per week, please DM me. I'd really appreciate a couple of extra hands to get the ball rolling a bit more (especially if you have involvement in EA community building, as I don't at all).
8 · 10d
There are two philosophies on what the key to life is. The first philosophy is that the key to life is to separate yourself from the wretched masses of humanity by finding a special group of people that is above it all and becoming part of that group. The second philosophy is that the key to life is to see the universal in your individual experience. And this means you are always stretching yourself to include more people, find connection with more people, show compassion and empathy to more people. But this is constantly uncomfortable because, again and again, you have to face the wretched masses of humanity and say "me too, me too, me too" (and realize you are one of them).

I am a total believer in the second philosophy and a hater of the first philosophy. (Not because it's easy, but because it's right!) To the extent I care about effective altruism, it's because of the second philosophy: expand the moral circle, value all lives equally, extend beyond national borders, consider non-human creatures. When I see people in effective altruism evince the first philosophy, to me, this is a profane betrayal of the whole point of the movement.

One of the reasons (among several other important reasons) that rationalists piss me off so much is that their whole worldview and subculture is based on the first philosophy. Even the word "rationalist" is about being superior to other people. If the rationalist community has one founder or leader, it would be Eliezer Yudkowsky. The way Eliezer Yudkowsky talks to and about other people, even people who are actively trying to help him or to understand him, is so hateful and so mean. He exhales contempt. And it isn't just Eliezer: you can go on LessWrong and read horrifying accounts of how some prominent people in the community have treated their employees or their romantic partners, with the stated justification that they are separate from and superior to others. Obviously there's a huge problem with racism, sexism, and anti-LGBT prejudice…
26 · 1mo
I'd love to see Joey Savoie on Dwarkesh Patel's podcast. Can someone make it happen? Here's Joey with Spencer Greenberg: https://podcast.clearerthinking.org/episode/154/joey-savoie-should-you-become-a-charity-entrepreneur/
15 · 1mo
Apply now for EA Global: London 2025 happening June 6–8. Applications close on May 18 at 11:59 pm BST (apply here)! We're excited to be hosting what's shaping up to be our biggest EAG yet at the InterContinental London–The O2. We expect to welcome over 1,500 attendees. We have some travel funding available. More information can be found on the event page and EA Global FAQ. If you have any questions, please email us at hello@eaglobal.org!