
First post here (long time lurker) and I’m starting off spicy :) While I have several EA friends and generally align with EA ideas, I don’t identify as an effective altruist. Tangent: I think veganism has a branding problem too, and I only outwardly identify as a “vegan” because it simplifies interactions like ordering in restaurants; however, I generally don’t need the EA label to facilitate day-to-day life and have successfully avoided it thus far. This post outlines the three biggest gripes I have about the EA community, which I think are turning other people off too.

  1. “It’s a hive mind”[1]

    When you talk to people who are EA-aware but not actually EA (especially in tech), they’ll likely tell you EA feels like a cult. The press on personalities like SBF and Rob Granieri certainly didn’t help, though the EA community does (subconsciously) enforce quite a bit of uniformity in thoughts and actions — everyone generally agrees on the most important causes and the most effective ways to contribute to these causes, so everyone feels obliged to dedicate themselves to these causes in these specific ways. As a more concrete example, I agree AI safety/alignment is existential, but I’m just not that interested in working on it despite having the right skills/background; given EA’s current focus on AGI, I feel as if I’m doing something BAD by not dedicating more time to this topic whenever I talk to an EA member or simply read an EA article. More broadly, “effective altruism” implies that if you don’t do things our way, it’s ineffective/irrational, which can be quite a blow to those who don’t want to feel like a dummy.

  2. “Holier-than-thou”

    This critique is mostly about the 10% pledge (yes, I understand the caveats that it’s not literally 10% for everyone every year). I get that accountability and community around giving is helpful, though I wonder if the orange or blue diamonds are sending the right signals (do we have data on how people hear about the pledge vs. their chance of taking it?). The little icon next to user names in social media is giving “cult” vibes again (think a cross or an astrological sign next to someone’s user name). The bigger problems are beyond virtue signaling: (1) it doesn’t work synergistically with FIRE (I’m planning to FIRE before 30, which is quite extreme, but many EA-curious people I know are also interested in FIRE), (2) it can feel overwhelming, especially to those struggling with mental health or neurodivergence (I’m autistic and can get sensory overloaded by drinking water or eating breakfast, so having my name on a public list and being asked to report my donations all the time for the rest of my life would definitely overwhelm me to the point of deterrence). Personally, my partner and I donate on average ~$10k USD every year (plus employer matching for the most part), which is only ~1% of my income, but saving aggressively allows me to stop working early, use my time however I see fit, and continue giving a similar amount from capital gains. I don’t plan to have kids and want to donate most of my assets to effective charities upon my passing. In fact, wouldn’t it be much easier in general for people to conceptualize and pledge a certain % of their total assets to EA causes upon passing instead of doing it every year? I understand there can be value drift and reduced cost-effectiveness over time, but the steady state of these two approaches (given a large enough pledge base) doesn’t seem too different.

  3. “Sound like AI”

    EA posts can be full of deep, hierarchical tables of contents and longer-than-necessary content. EA is also associated with obscure (to gen pop) concepts like longtermism, accelerationism, micromorts etc. As mentioned above, I’m autistic (and deeply interested in philosophy), yet I often find reading EA content exhausting. When I talk to my EA friends, they don’t sound like AI-generated academic papers and our colloquial/ less researched exchanges can feel more convincing than reading way too many stats and big words.

I hope I don’t come off as too condescending, and I’m open to hearing counterarguments. Again, I resonate with EA’s core ideas and would love to brainstorm together on how to make them more appealing to the average guy, gal, or non-binary pal. I’d also like to hear your hot takes that may not align with the standard EA stance.

  1. ^

    I'm using quotation marks because I don't fully endorse the connotations these comments carry. I think they’re a bit tongue-in-cheek — not totally fair, yet with a kernel of truth worth considering.


"Personally, my partner and I donate on average ~$10k USD every year (plus employer matching for the most part), which is only ~1% of my income". I think this is where the disconnect comes from. At a ~$1 million/yr income, it seems you are prioritizing early retirement and a luxurious lifestyle over EA causes and giving. That's normal preference expression for the ultra-wealthy. It's just going to seem discordant for many in EA making $50-$150k/yr and giving 10%+ who place (relatively) higher priority on giving. There's a difference between what you value and prioritize and most people in the movement. I'm not trying to make a normative statement; just pointing out a difference that is likely causing the outsider feeling. It's a good thing you're donating, and thinking about how to give effectively.

On the diamonds next to people's names and the holier-than-thou attitude: Having been in the movement a while, I often encounter the cult-like and holier-than-thou perception of EA, even from friends and family. The perception usually comes from a deep skepticism that people could be fundamentally motivated by altruism. It's easier to assume that it's either cult brainwashing that implies a loss of rational thinking and agency, a way to feel superior to others, or that it is all a virtue-signaling facade for reputational benefit. Knowing many people in the movement - most do have an intrinsic altruistic motivation. That such a motivation could exist is alien, even threatening, to many people. I'm not sure what to say about that beyond I hope skeptics can adjust their mental model of the world to include those who genuinely care about making it better.

Hi Matthew, I appreciate you trying to pinpoint the root cause of my disconnect, but I'd like to push back a bit: "it seems you are prioritizing early retirement and a luxurious lifestyle over EA causes and giving" - yes to the "early retirement" part and no to the "luxurious lifestyle" part. I'm very frugal even compared to those under the poverty line in the US/Canada; however, I want to leave a large emergency cushion since I'm stopping work at such a young age. If I had more intrinsic desire to work, I'd be much more open to working a normal 80k-hr career and donating a larger % of my income on an ongoing basis.

"Knowing many people in the movement - most do have an intrinsic altruistic motivation." - I don't find it unbelievable that people have an intrinsic altruistic motivation and I respect the EA community for their service. My critique is more so that EA seems to be focused on "how to" for the in-group (those already super into the EA concept) and less so on outreach to near-groups, let alone out-groups. In other words, how can EA turn 90% of people into semi-altruists instead of turning 1% of people into perfectly effective altruists? I think the 10% pledge in its current structure isn't very appealing to 90% of people.

I know the FIRE community floats 25x your annual spend in savings as a target for retirement. At your income and "frugal even compared to those under the poverty line", it would take you less than a year to hit that target. Taking what you say as true, it means you are prioritizing one fewer year of working far more than altruistically helping others. That is discordant with the median attitude in the community, where people imagine themselves working effectively half a decade or more solely for the benefit of others. I don't want to focus too much on the money. It's the relative self vs. others prioritization. That's a tension that is always going to exist between the FIRE community and the EA community.
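
To make that arithmetic concrete, here's a rough sketch of the 25x rule and how a donation rate shifts the timeline. The income, spending, and return figures are hypothetical placeholders I've picked for illustration, not anyone's actual finances, and taxes are ignored.

```python
# Rough sketch of the FIRE "25x annual spend" target and how a donation rate
# shifts the time needed to reach it. All figures are hypothetical.

def years_to_fire(income, annual_spend, donation_rate=0.0, real_return=0.05):
    """Count years until savings reach 25x annual spend (taxes ignored)."""
    target = 25 * annual_spend
    savings = 0.0
    for year in range(1, 101):
        savings *= 1 + real_return                               # investment growth
        savings += income * (1 - donation_rate) - annual_spend   # new contributions
        if savings >= target:
            return year
    return None  # not reached within 100 years

# Very high income plus very frugal spending: the target is hit almost immediately,
# and donating 10% along the way barely moves the date.
print(years_to_fire(income=1_000_000, annual_spend=30_000))                      # 1
print(years_to_fire(income=1_000_000, annual_spend=30_000, donation_rate=0.10))  # 1
```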

As for outreach, that's been studied: https://forum.effectivealtruism.org/posts/r8XoHhKKzmQgxm2Lf/ea-survey-2024-how-people-get-involved-in-ea#The_effect_of_outreach Most people find EA on their own. Having been an organizer in several groups and given lots of EA presentations, I found active outreach to be unproductive. The D.C. group once held an 800+ person, heavily-advertised, Peter Singer-headlined event that resulted in just one new person coming to the next meetup, and they didn't come back. My group now routinely gets new people passively who heard about it on a podcast, through 80k, or through the vegan community. The movement's utilitarianism and universalism across time, place, and species doesn't fit well with personal value systems based on justice/prioritarianism, self-interested libertarianism, or racial/cultural/religious tribalism. Altruism is (unfortunately) rare, and it's easier for those people to find the EA community than for the EA community to find them.

"Its the relative self vs. others prioritization. That's a tension that is always going to exist between the FIRE community and the EA community." - I agree with this statement. However, do you think the selfish vs. altruistic trait is a bi-modal or a normal-ish distribution? My intuition is the latter, that most people want to do some good but are also somewhat selfish.

This actually leads into outreach strategy. I'm not a community organizer but I know it's hard work, so kudos to you for doing the meta-work. I want to challenge the "success metric" for the outreach. It sounds like you're using "who's coming to an EA meetup" as a proxy. In my opinion, the real beneficiaries of the EA movement are other people, animals, and perhaps future non-biological sentient beings. So I think a better proxy metric would be something like "how much money is donated to these beneficiaries" - one doesn't need to attend any EA meetup or know about this forum to donate to, e.g., the Humane League. Anecdotally, I have a few "self-interested libertarian" friends and I was able to convince them to donate to the Shrimp Welfare Project recently.

True/pure altruism is indeed rare, but I believe most people are at least semi-altruistic. They (perhaps including me) may not be a good fit for the core EA community, but they're open to supporting EA causes.

For what it's worth, Peter Singer's organization The Life You Can Save has a donation pledge that adjusts the percentage based on your income. You can type in your income and it will give you a percentage back. At $10,000, it's 0%. At $50,000, it's 1%. At $100,000, it's 1.8%. At $500,000, it's 10%. And at $1,000,000, it's 15%.
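
Purely to illustrate the shape of that scale, here's a toy interpolation over the data points quoted above. This is my own illustrative sketch, not The Life You Can Save's actual calculator formula, which I haven't looked at.

```python
# Toy log-linear interpolation over the quoted (income, percentage) data points.
# Illustrative only; not The Life You Can Save's real formula.
import bisect
import math

POINTS = [(10_000, 0.0), (50_000, 1.0), (100_000, 1.8),
          (500_000, 10.0), (1_000_000, 15.0)]

def suggested_percentage(income):
    incomes = [i for i, _ in POINTS]
    if income <= incomes[0]:
        return POINTS[0][1]
    if income >= incomes[-1]:
        return POINTS[-1][1]
    hi = bisect.bisect_right(incomes, income)
    (x0, y0), (x1, y1) = POINTS[hi - 1], POINTS[hi]
    # interpolate linearly in log-income space between neighbouring points
    t = (math.log(income) - math.log(x0)) / (math.log(x1) - math.log(x0))
    return y0 + t * (y1 - y0)

print(round(suggested_percentage(75_000), 2))   # falls between 1% and 1.8%
print(round(suggested_percentage(250_000), 2))  # falls between 1.8% and 10%
```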

So, this pledge is less demanding than the Giving What We Can pledge and, also, nobody is saying you have to take either pledge to be a part of EA. 

Most people on the EA Forum don't seem to have the little blue or orange diamonds next to their usernames. Probably at least a few just haven't added a diamond even though they have taken the Giving What We Can pledge, but as far as I know, a lot of people genuinely haven't taken it. Maybe even the majority, who knows. When I ran an EA group at my university, I think at least about half of the regular, active members didn't take the GWWC pledge, and I'd guess it was probably more than half. (It was a long time ago, and it's not something we kept track of.)

In my personal experience with EA, I've never seen or heard anyone say anything like, "You should/need to take the pledge!" or "Why haven't you taken the pledge yet?" I've never seen anyone try to give someone the hard sell for the GWWC pledge or, for that matter, even try to convince them to take it at all. 

Personally, I'm very much a proponent of not telling people what to do, and not trying to pressure people into doing anything. My approach has always been to respect people's autonomy and simply talk about why I donate, or why I think donating in general is a good idea, to the extent they're curious and want to know more about those things. 

I think where Matthew's comments resonate is just that it's hard to understand how your math checks out. For example, the average lifetime earnings of Americans with a graduate degree (which are significantly higher than for all other educational cohorts, including those with only bachelor's degrees) from age 20 to 69 are $3.05 million (adjusted for inflation from 2015, when this data was collected, to 2025). If you're earning around $1 million a year, then within about 3 years at that income level, your lifetime earnings will match the average lifetime earnings of Americans with a graduate degree. It's hard to square the idea that you only want to live a frugal lifestyle, comparable to someone around the U.S. poverty line, or even a lifestyle equivalent to someone with the U.S. median income, with the idea that you earn around $1 million a year and that donating 10% of your income is too demanding, even accounting for the fact that you want to retire extremely early.

And retiring before age 30 is itself a sort of luxury good. Even if donating 10% of your income would cause you to overshoot your goal by, say, 2 years and retire at age 31 instead of age 29, is that really a flaw in the concept of donating 10% of your income to help the world's poorest people or animals in factory farms? If it is correct to think of extremely early retirement as a kind of luxury good, then is it all that different for someone to say the 10% pledge asks too much because it would require them to retire at 31 instead of 29 than it would be for someone to say the pledge asks too much because they want to buy a $600,000 Lamborghini? I'm not passing judgment on anyone's personal choices, but I am questioning if it's a valid criticism of the GWWC pledge that it might be incompatible with some people acquiring certain luxury goods reserved for the wealthiest 1% of people in high-income countries. So what if it is? Why is that a problem? Why should people in EA want to change that?

But in any case, it's up to you to decide what percentage you want to donate out of your current income or your investment income after you retire early. If 10% is too onerous, you can donate less than 10%. You could put whatever you expect your income during retirement to be in The Life You Can Save's calculator and see if you think that would be an amount you'd be comfortable giving after you retire. Every additional dollar donated is a better outcome than one dollar less than that being donated. So, just think about what you want to donate, and donate that.

People in EA already do tend to think in marginal terms and to wonder what the equivalent of the Laffer curve for effective altruism might be. Nobody has ever gotten this down to an economic science, or anything close, but it's something people have been thinking about and talking about for a long time. My general impression is that most people in EA have been very open to people coming into EA with various levels of commitment, involvement, or donating. 

The only real counterexample to this I can think of is when one person who has since (I believe) disassociated themselves from EA argued in defense of the parent organization of the Centre for Effective Altruism purchasing Wytham Abbey. Their argument was that it's all the better if normal people find this repugnant, since it signals (or countersignals) that EA has weird ideas and morals, and this helps attract the weird people that EA needs to attract to, I don't know, solve the problems with technical AI alignment research and save the world from an imminent apocalypse and usher in a post-scarcity utopia. I find this ridiculous and quite a troubling way to think, and I'm glad most people in EA seem to disagree with this view on the Wytham Abbey purchase, and with this kind of view in general about signaling (or countersignaling) correctly so as to attract only the pure minds EA needs.

Maybe there's still some of that going around, I don't know, maybe there's a lot of it, but somehow I get the impression that most people in EA aren't into gatekeeping or purity of that kind. On the other hand, I'm only really thinking here about joining the movement at the entry level, and if you want a job at an EA organization or something like that, people will probably start to gatekeep and apply purity tests. 

In other words, how can EA turn 90% of people into semi-altruists instead of turning 1% of people into perfectly effective altruists? I think the 10% pledge in its current structure isn't very appealing to 90% of people.

Looking at successful non-EA social movements, I suspect that endorsing a multi-level approach rather than moving away from higher-commitment organizations would be the right move. Think of Christianity (or probably other religions, I just know Christianity better) -- you have the option of full-on commitment as a monk or a nun, but also significantly lower-commitment options to appeal to the larger population. That doesn't mean moving away from offering medium- or higher-commitment options, though.

I don't think there is a great understanding of why EA has been relatively unsuccessful at reaching broader populations at more modest commitment levels. I think it is in part a cultural issue, but I don't think that's all of it.

The press on personalities like SBF and Rob Granieri certainly didn’t help

(As a datapoint, I had no idea who Rob Granieri was before reading this post, and I'm probably not the only one, because he doesn't seem to have ever been mentioned here before.)

Welcome to the Forum, Zoe! I guess my knee-jerk response to this would be that I agree these are significant problems with EA branding, but I don't think most of them have easy, tractable answers (sadly a common occurrence with EA branding problems imo, eg "longtermism" being perceived as callous toward present-day issues).

"Hive mind" seems hard to avoid when building a community of people working toward common goals that are somewhat strange. "Holier-than-thou" is almost inevitable in "doing the most good one can" (and EA seems in fact quite relaxed by this standard, though your specific criticisms of the 10% pledge were interesting to read). "Sounds like AI", however, is probably fixable, and individuals could make some efforts, in the age where "AI-like writing" is increasingly criticized, to have a slightly warmer style, and maybe to de-emphasize bulletpoints somewhat? (less sure about this, I like bulletpoints)

But above all, I want to say, congratulations on your yearly donations! Even if it's not the holy grail of 10%, 10K a year is absolutely no joke, and giving 10% is far from having become an EA norm anyway. This level of donations, and the plan to keep going, is rare and precious. Thank you for doing so much for others!

Hi JoA - I see you're interested in animal welfare and invertebrate welfare, so I just want to say these are causes I consistently prioritize in my donations. I'm also looking into the AI x animal welfare space (I mentioned I'm not generally interested in AI safety, but I am interested in its applications/implications for animals) and my preliminary takeaway is that AGI doesn't substantially change how we should approach animal welfare in the near term (i.e., what helps animals now/soon will likely continue to help them under AGI, likely even more).

Hi Zoe! It's thrilling to meet others with an interest in invertebrate welfare (it doesn't happen every day), and congratulations again on donating to a cause that is rarely considered appealing! Unsurprisingly, there's really no consensus on what one should do for animals in the face of AGI. However, there's a lot of exchange around what AI could mean for animals on the Sentient Futures Slack, and if you have some thoughts you want to share about this, I'm sure there are many members there (including me) who'd be happy to read your current takes on the topic!

Hi Zoe. I'm glad you've crossed over from lurking to participating. I gave this post an upvote even though I disagree with a lot of it, much as I wanted to agree. I agree with this part:

the EA community does (subconsciously) enforce quite a bit of uniformity in thoughts and actions — everyone generally agrees on the most important causes and the most effective ways to contribute to these causes

The conformity is way too high: there is far too much internal agreement and far too little internal disagreement.

When I was involved in organizing my university EA group, one conversation we had was about the value of art. Someone in our group talked about a novel she had found important and impactful. Can we really say that anti-malarial bednets are more important than art? I think a lot of people in EA feel (and, indeed, in our EA group at the time felt) a temptation to argue back against this point. But there's a more intriguing and more expansive conversation to be had if you don't argue back, take a breath, and really consider her point. (For example, have you considered the impact sci-fi has had on real-life science and technology? Have you considered the role fiction plays in teaching us moral lessons? Or in understanding emotions and relationships, which are what life is all about?)

I think, in general, it's way more interesting to have a mix of people with diverse personalities, interests, and points of view, even when that means sometimes entertaining some off-the-wall ideas. (I don't think what that person said about art was off-the-wall at all, but talk to enough random people about EA online or in real life and you'll eventually hear something unexpected.)

This is the part of your post I have the hardest time with:

I wonder if the orange or blue diamonds are sending the right signals (do we have data on how people hear about the pledge vs. their chance of taking it?). The little icon next to user names in social media is giving “cult” vibes again (think a cross or an astrological sign next to someone’s user name).

Is the little orange or blue diamond so different from someone having an emoji in their username, or, in real life, wearing a little pink or red ribbon for breast cancer or HIV/AIDS awareness? I have a hard time relating to your perspective because if on Twitter or wherever I saw someone put a cross or an astrological sign next to their name, I think I would just assume they are religious or really into astrology. I wouldn't find it particularly scary or cult-y.

Personally I wish the EA Forum had more ways to zhuzh up how your username appears on posts and comments. The little diamonds are the only bit of colour we get around here. 

Full-on profile pictures embedded in posts and comments might be too distracting, but I don't know... coloured usernames? Little badges to represent things like your country, your favourite cause area, or your identity (e.g. LGBT)? I find one advantage of having something like this is not just the zhuzh but also it makes it easier to remember who's who rather than having to memorize everyone's names. The little blue and orange diamonds already help a bit with this.

[Edit: I decided to zhuzh up my username with emojis because it looks ridiculous but also kinda cute and it really made me laugh. Lol.]

having my name on a public list and being asked to report my donations all the time for the rest of my life would definitely overwhelm me to the point of deterrence

Is this really what Giving What We Can asks you to do these days? I took the 10% pledge back in 2008 or 2009. I have no idea if my name is still on a public list and I don't think I have ever once reported my donations. I can empathize with hating the administrative burden part of it because I really struggle with admin tasks of all kinds (I think a lot of people do) and I find a lot of admin stuff miserable and demoralizing.

I guess the point of reporting your donations is so that GWWC can say how much money people are donating as part of this movement, but obviously that's of secondary importance (a very, very distant second) to actually donating the money. I always saw the 10% pledge as a personal, spiritual commitment and not a promise I made to anyone else. Nor as something I was obligated to report. It's a reminder to myself of what my values are: "hey, remember you said you were going to do this??"

So, if you feel you want to do the pledge but don't want to do the admin, just do the pledge and don't do the admin. :)

In fact, wouldn’t it be much easier in general for people to conceptualize and pledge a certain % of their total assets to EA causes upon passing instead of doing it every year?

Would it be? You'd be asking people to think about dying, which isn't easy. Also, you'd be asking them to write a will, which is a lot of admin! 

Also, if the average person who is interested in EA is 38 years old — which is Will MacAskill's age — and their average life expectancy is 80, doesn't that mean no one would donate anything to charity for, on average, the next 42 years? And wouldn't that be really bad? 

I think your idea of donating a percentage of your passive income from capital gains to charity after you retire early is perfectly fine — that's just donating a percentage of your income, which is the whole idea in the first place. Maybe you'll want to donate less than 10% and that's fine too. 

I think everyone should find what works for their particular situation. The 10% pledge is formulated to be something that could apply to the majority of the population in high-income countries, but not something that necessarily makes the most sense for everyone in those countries. 

“Sound like AI”... When I talk to my EA friends, they don’t sound like AI-generated academic papers...

"Sounds like AI" is the wrong way to put this. Posts on the EA Forum don't sound like AI. They have a distinct voice that is different from ChatGPT, Claude, or Gemini. LLMs have a distinctive bland, annoying, breathless, insubstantial, and absolutely humourless style. The only thing really similar to the EA Forum style and LLM style is the formal tone. Maybe EA Forum posts sound like academic papers, but they don't sound like AI-generated academic papers.

I know because I've read a lot of stuff on the EA Forum and a lot of stuff written by AI. I can really tell the difference.

EA is also associated with obscure (to gen pop) concepts like longtermism, accelerationism, micromorts etc. ... When I talk to my EA friends... our colloquial/ less researched exchanges can feel more convincing than reading way too many stats and big words.

This is more accurate. EA/the EA Forum has its own weird subculture and sublanguage, and it's pretty annoying. People use lingo and jargon that isn't useful or clear, and sometimes has never even been defined — I hate the term "truthseeking" for this reason: what does it mean? (As far as I know, it's literally never been defined, anywhere, by anyone. And it's ambiguous. So, why is that term helpful or necessary?) People assume too much background knowledge and don't explain things in an accessible way; more accessible explanations wouldn't just help newcomers, they would help everyone.

What you said about casual, informal conversations with your EA friends being more persuasive is an argument in favour of people in EA having more casual, informal conversations on the EA Forum, or on podcasts, or whatever. Before I read your post, I already had the intuition that this would be a good idea. 

I want to suggest to everyone the concept of doing public dialogues on the EA Forum, following the model of the Slack chats that FiveThirtyEight used to do on their blog. The FiveThirtyEight staff would pick a topic, chat about it on Slack, and then do some light editing (e.g. to add links/citations). Then they'd publish that on their blog. I think this could work really well for the EA Forum. You could either do the chat in real time (synchronously) or take time doing it (asynchronously). But I think it would be more fun if people didn't spend too much time writing each message, and if they tried to be more casual and informal and conversational than EA Forum posts typically are. I just have a hunch that this would be a good format. (And anyone can message me if they want to try this with me.)

In terms of length, personally, I'm not as concerned with how long something is as I am with its economy of words. I don't like when things are long and they're longer than they could have been. If something's long but it's still as short as it could have been, that's great. (That's why books exist!!) If something's long and I feel like it could have been 20% of its length, that's a huge drag. If something's short but it makes a complete point and says everything it really needs to say, that's like a delightful piece of candy. I love reading stuff like that. But not everything can be candy. (And if we feel like it should be, maybe we can blame Twitter for conditioning us to want everything to be said in 140-280 characters.)

What makes something feel longer or shorter is also how enjoyable it is to read, so it's also a matter of craft and style. 
 

Hi Yarrow - thank you for taking the time to reply so thoroughly! I love your new emoji flair.

talk to enough random people about EA online or in real life and you'll eventually hear something unexpected

Yes, I always enjoy talking to people about EA and usually find more diversity of thought when people converse in real life than when writing online (perhaps this is an in-group vs. out-group thing: while most causes have more in-fighting among the in-group, the EA community seems to have dodged this problem at the expense of high in-group conformity, but this constraint can relax when talking to the out-group irl).

Is the little orange or blue diamond so different from someone having an emoji in their username, or, in real life, wearing a little pink or red ribbon for breast cancer or HIV/AIDS awareness?

I have no issues with people advertising their identity or interest or quirkiness with some kind of flair. I used a cross and astrological signs as an analogy because, like the orange diamond, they convey a sense of superiority like "I will be saved (while non-believers go to hell)" or "I'm in tune with the cosmos (while low-vibration people slave through life)". I acknowledge these are stereotypes and not everyone uses these symbols with the same judgmental intention.

On the 10% pledge itself, my point is not that no one should donate now, but that the "official" EA pledge has too rigid of a structure. To me, the vibes are similar to most vegan activists telling people "going vegan" is the only right way and hating on vegetarians or reducetarians because they're not meeting the standard. I think for most causes, two things are true at the same time: (1) people who have hardly thought about it should do way more about it and (2) people who have thought much about it, besides directly addressing the cause, should probably focus on effective outreach to people in (1) but chill on policing others in (2).

"Sounds like AI" is the wrong way to put this.

You're right, the point should be more about sounding too formal/academic rather than sounding too AI. I often use AI to help me polish my writing, so I tend to associate AI writing with more structure and more serious tone, but this is a usage bias. I think doing public dialogues on the EA Forum is a great idea.

I guess you can put a lot of meaning into a little symbol. I wouldn’t interpret a cross or an astrology sign as conveying a sense of superiority, necessarily, I would just think that person is really into being Christian or really into astrology. 

If you see someone wearing a red ribbon relating to HIV/AIDS, I guess you could have the Curb Your Enthusiasm reaction of: “Wow, so they’re trying to act like they’re so much better than me because they care so much about AIDS? What a jerk!” Or you could just think, “Oh, I guess they care about AIDS for some reason.”

I’ve never perceived anyone to be using the little blue and orange diamond icons to signal superiority. I interpret it as something more supportive and positive. It’s reassuring to see other people do something altruistic so you don’t feel crazy for doing it, and making a sacrifice feels more bearable when you see other people doing it too. (Imagine how different it would feel if when you donated blood, you did it completely alone in an empty room vs. seeing lots of other people around who are giving blood at the same time too.)

I’ve never observed anyone trying to police someone over donating 10% of their income, or trying to pressure them to take the pledge, or judging them for not taking it. For all I know, that has happened to somebody somewhere, I’ve just never seen it, personally. 

I would say don’t worry too much about the 10% income pledge and just focus on whatever amount of donating or way of donating makes sense for you personally. 

I would be concerned about people deciding to delay their donating by 40-50 years (or whatever it is), since there are probably huge opportunity costs. I hope that in 40-50 years all the most effective charities are way less cost-effective than the most effective charities today because we will have made so much progress on global poverty, infectious diseases, and other problems. I hope malaria and tuberculosis aren’t ongoing concerns in 40-50 years, meaning the Against Malaria Foundation wouldn’t even exist anymore — mission accomplished! But you said you’re already donating about 1% of your income every year, so you’re not holding off completely on donating. 

Hi Zoe! Great to see you on the Forum.

I know that EAs and FIRE people often hang out in similar social spaces - as they're both interested in things like higher-paying jobs, lower-cost lifestyles, and well-chosen investment strategies. I also know of a few FIRE people who plan on donating their wealth to effective charities should they not end up spending it all in retirement. I believe the rough opinion of EAs who donate now is that there are a number of pressing near-term issues that will (hopefully!) not exist in 40 years. One might hope that we'd have eradicated malaria by then, for example. So people who are really interested in fighting malaria (or other neglected global health challenges) donate now.

I think the choice for an effective giver to pledge or not to pledge, to advertise or not to advertise one's giving, to donate now or later etc. is really an individual's choice, and I'd support them setting up a structure that works for them and their life. It seems like you have a structure that works for you, and I'm glad of that.

On a separate note: I'm hoping that EA is getting away from being "AI safety long words club" as a result of a) AI safety group organising spinning out of EA into its own thing, and b) funded AI safety work becoming more about public and government engagement rather than fairly niche research.

I have mostly decided that I don't care what anyone else thinks about the diamond next to my name, because anyone else that could be thinking of this is probably not someone in danger of dying from a neglected tropical disease. There are probably a few people in my life that would be actively upset if they were aware I was giving away thousands of pounds a year but not to them. I don't go out of my way to wave it in their face that that's what I'm doing, that'd be crass. But I am really rather over the idea that we all need to pretend that we don't have any money spare for giving while at the same time shelling out piles of cash on personal spending.

My hottest take is that EA needs to figure out better engagement strategies for effective givers and not constantly dunk on "earning to give" as a controversial concept. And also make its large events cheaper (there's been great progress on this). I reckon EA needs to prepare for the very real possibility/eventuality of our per-person infrastructure funding getting restricted.

Hi Kestrel - thank you for your thoughtful reply and I agree with your hot takes :) There are definitely similarities between the FIRE and EA communities, as both agree that a certain level of wealth/comfort is enough, but the trade-off is that the more one gives in the near term, the later one can FIRE with the same emergency cushion and/or the less ongoing giving one can draw from investments. I'm totally in support of letting effective givers figure out what works for them. My point is not that no one should donate now, but that the "official" EA pledge has too rigid of a structure.

In fact, wouldn’t it be much easier in general for people to conceptualize and pledge a certain % of their total assets to EA causes upon passing instead of doing it every year? 

It might be easier to conceptualize, but it might not be particularly impactful.

(I'm going to use numbers close to US medians. The 10% Pledge was created by middle-class people, and I think its middle-class accessibility is an important part of keeping EA at least mildly "democratic.")

Suppose a 25-year old ("Jane Doe") will have total earnings of $3M over a 40-year period in current dollars (averaging $75,000 per year in current dollars). The national average wage index is about $69K nowadays, but round numbers are easier to work with.

So the value of what the pledge asks of Jane is $300K in current dollars. That's a significant ask, to be sure, and much ink has been spilled about whether it is the right amount. It probably is a bit too much for Jane, but then again I expect that the average person with openness to the 10% Pledge has above-average earnings capacity. Although I would prefer some income gradation to the Pledge, the people who run it have emphasized simplicity, and it doesn't sound like a graduated scale would address your concern anyway.

I question whether Jane's estate will have $300K in current dollars in it when she dies. The US median household wealth was $167K in 2021, mostly home equity. Older workers will have more, but there's also dissaving in retirement. The average inheritance conditioned on receiving an inheritance is $184K according to this (also 2021). 

Even if we're generous and assume Jane will have $300K in current-dollar assets when she dies, the "certain % of their total assets" would have to be 100% to match the commitment level of the current pledge. For at least most people with kids -- and if Jane is statistically common, she will have at least one child -- that is a very hard sell. If the ask was (say) 25% of assets at death, the pledge might be only 12-25% as impactful as the 10% Pledge.

So, at a minimum, I think one would need to compare an equal-sacrifice version of "% of income" vs. "% of assets at death." 
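
As a very rough, illustrative sketch of that comparison, using the hypothetical figures above plus an assumed 3% annual discount on donations made later (the discount rate is my assumption, purely for illustration), the gap looks something like this:

```python
# Back-of-the-envelope comparison of "10% of income each year" vs.
# "X% of the estate at death", using the hypothetical figures above.
# The 3% "philanthropic discount rate" is an assumption for illustration only.

def pv_income_pledge(annual_income=75_000, rate=0.10, years=40, discount=0.03):
    """Present value of giving `rate` of income every year for `years` years."""
    return sum(annual_income * rate / (1 + discount) ** t for t in range(years))

def pv_bequest_pledge(estate=300_000, rate=0.25, years_until_death=60, discount=0.03):
    """Present value of giving `rate` of the estate at death."""
    return estate * rate / (1 + discount) ** years_until_death

print(f"10% of income for 40 years: ~${pv_income_pledge():,.0f}")   # roughly $180k
print(f"25% of estate at death:     ~${pv_bequest_pledge():,.0f}")  # roughly $13k
```

Even before any discounting, the estate route moves much less money under these assumptions; with discounting, the gap only widens.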

A few other observations:

  • The balancing act, I think, is minimizing a "holier than thou" attitude while making an implied moral claim effectively. Any solution involves some degree of tradeoffs.
  • When most people sign the Pledge, I think they can be seen as impliedly asserting something. That is approximately: People similarly situated to me should give away a meaningful amount of their income, where meaningful is an amount whose absence is clearly noticed but is not onerous. It's inevitable that making this kind of assertion is going to be controversial.
  • The signaling value of having a community of people pledging to make donations when they die is much weaker than having a community of live givers. It's not the pledging that conveys moral weight so much as the doing. It's very easy to dismiss mere pledges without action as performative.
  • I think the claim-making power of giving is lower when one gives only (or predominately) at death for a different reason as well. The universe has already decided to divest you of your money at that point. So in some sense, the donor who gives at death is spending (what would counterfactually be) other people's money. I take -- and I think many people take -- the moral claims of people who opine about how to spend other people's money much less seriously than the claims of people who spend their own money on a cause.

Hi Jason - thanks for spelling out the details here. I agree with your observations. I'm not proposing to eliminate the 10% pledge and am more "endorsing a multi-level approach" as you said in your other comment. I acknowledge that (1) it's easier to let go of "(what would counterfactually be) other people's money" and (2) it'd likely be a lesser sum than lifetime giving, but it may be a good lower-commitment option. Gen Z are having fewer kids (which is reflected in my friend group, where especially high-income women are not planning to have any kids) and people tend to think more about their legacy in old age, so this proposal could be appealing. To be clear, I think people who take the 10% pledge are super respectable and the trial pledge is a good way to get people on the fence in the door. Giving What We Can has already added the wealth-based pledge and the Further Pledge as newer options, and I wonder if it would be more accessible if it gave a whole list of options (as another example, X% of your inheritance/trust - I know many trust fund babies who would be happy to do something, anything, with their money. Again, it doesn't carry the weight of an average middle-class person giving 10% every year, but it would be a lot more money funneled into effective causes rather than naming another building at Harvard after themselves) and tweaked the messaging to be something like "While we encourage people to take the 10% pledge, and have a trial pledge that you can easily get started with, we also realize there are many other giving options that you may prefer to pursue, for example: [insert list of options]. Should you choose not to take the 10% pledge, we still urge you to consider directing your donations towards effective charities."

I suppose I'm really making two main points here:

  • More mass appeal (both to the not-so-well-off and to the wealthy people) without giving up the full-commitment option
  • Emphasis on the "effective" part more than the "10%" part to the mass while affirming/not diluting the meaning of the 10% pledge for those choosing the full commitment