One criticism EA gets all the time is that we're coldhearted, Borg-like, cost-benefit-obsessed utility maximizers. Personally, I like that about EA, but I see huge value in being, and being perceived as, warm and fuzzy and hospitable.

Over at LessWrong, jenn just wrote an insightful post about her top four lessons from 5,000 hours working at a non-EA charity: the importance of long-term reputation, cooperation, slack, and hospitality.

Here, I propose a modification to the EA norm of donating 10% of income annually to an EA-aligned/effective charity: donate 8% of income to EA-aligned/effective charities, and 2% to charities that are local, feel-good, or that we're passionate about or identify with on a personal or cultural level.

As an example, if you make $80,000/year, you might consider donating $6,400 to GiveWell and $1,600 to the local food bank. If you work as an employee of an EA-aligned organization (so 40 hours of direct work per week), you might consider doing 4-5 hours/week of volunteering to help the homeless.
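To make the arithmetic explicit, here is a minimal sketch; the function name and the 8%/2% defaults are just illustrative, matching the example figures above rather than any official pledge:

```python
# Illustrative only: split an annual income under the proposed 8%/2% norm.
# The helper name and default percentages are hypothetical.

def donation_split(income: float, effective_pct: float = 0.08, local_pct: float = 0.02):
    """Return (effective_donation, local_donation) for a given annual income."""
    return income * effective_pct, income * local_pct

effective, local = donation_split(80_000)
print(f"Effective charities: ${effective:,.0f}")  # Effective charities: $6,400
print(f"Local / feel-good:   ${local:,.0f}")      # Local / feel-good:   $1,600
```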

Here are some reasons why I think this is a good idea:

  • The average American donates about 2% of their income to charity. Under this new standard, the 8% we'd donate to EA causes comes on top of what most people already give. That makes EA less likely to be perceived as clawing donors away from other charities in a zero-sum charity competition. Instead, it encourages people to donate more - growing the pie.
  • It makes EA friendlier and more cooperative with value systems that are different from our own.
  • It boosts our reputation with people in our social and cultural network.
  • It gives participants in EA an outlet to get their need for warm-and-fuzzy feelings met.
  • It gives a perception of slack - instead of EA being associated with a stringent "no room for compromise, the stakes are too great" perspective, EA can project the "there is so much good we can do in the world" message we actually mean, in a way that resonates with the average person who's not an EA.
  • It makes it possible to tell a combination of stories about the work we do in the world. Taking action locally for the good of our own community is often easier to see and feel and talk about at the dinner table than giving anonymous-feeling donations to global health institutions or X-risk research groups.

If you prefer, you could simply add 2%-of-income on top of the 10% Giving What We Can pledge, or do whatever combination makes sense for your situation. In fact, I think it's probably best if we treat 2%/8% as a rough anchoring benchmark, while encouraging people to pick the blend that makes sense to them. Encouraging more individual choice and less adherence to a potentially rigid-seeming rule, while still having an anchoring point so the commitment means something, seems good for EA.

If we adopt this standard, I suggest we find additional ways to frame it besides the coldhearted-sounding rules-and-percentages manner I've used here. Rather than "we advocate giving 2% locally and 8% to effective charities, mainly for perception reasons," I would suggest explaining the rule with a qualitative, friendly-sounding statement like "we try to mix our donations and efforts to help our local communities while also working on the world's biggest problems."


Comments (9)



I disagree. In particular:

  1. Roughly, I think the community isn't able (isn't strong enough?) to both think much about how it's perceived and think well or in-a-high-integrity-manner about how to do good, and I'd favor thinking well and in a high-integrity manner.
  2. I'd guess donating for warm fuzzies is generally an ineffective way to gain influence/status.

(Of course you should be friendly and not waste weirdness points.)

Roughly, I think the community isn't able (isn't strong enough?) to both think much about how it's perceived and think well or in-a-high-integrity-manner about how to do good, and I'd favor thinking well and in a high-integrity manner.

Just want to flag that I completely disagree with this; moreover, I find it bewildering that in EA and rationalism this seemingly passes almost as a truism.

I think we can absolutely think both about perceptions and charitable effectiveness - their tradeoffs, how to get the most of one without sacrificing too much of the other, how they might go together - and both my post here and jenn's post that I link to are examples of that.

People can think about competing values and priorities, and they do it all the time. I want to have fun, but I also want to make ends meet. I want to do good, but I also want to enjoy my life. I want to be liked, but I also want to be authentic. These are normal dilemmas that just about everybody deals with all the time. The people I meet in EA are mostly smart, sophisticated people, and I think that's more than sufficient to engage in this kind of tradeoffs-and-strategy-based reasoning.

I'd guess donating for warm fuzzies is generally an ineffective way to gain influence/status.

As a simple and costless way to start operationalizing this disagreement, I claim that if I asked my mom (not an EA, pretty opposed to the vibe) whether she'd like EA better with a 2%/8% standard, she'd prefer it and say she'd think warmly of a movement that encouraged this style of donating. I'm only sort of being facetious here - I think having accurate models about how to build reputation for the movement is important, and EAs need a way to gather evidence and update.

Just flagging that I disagree with the language that EAs "should" donate 10% (in the sense that it's morally obligatory). I think whether or not someone donates is a complicated choice, and a norm of donating 10% a) sets a higher bar of demandingness than I think makes sense for inclusion in EA, and b) isn't even necessarily the good-maximizing action, depending on personal circumstances (e.g., some direct workers may be better off spending on themselves and exerting more effort on their work).

Sorry to be pedantic, but I think it's really easy for these sorts of norms to accidentally emerge based on casual language and for people to start feeling unwelcome.

I think donating at least 10% of one's income per year should be a norm for any person who identifies as part of the EA community, unless doing so would cause them significant financial hardship.

The whole point of EA is to actually do altruism. If someone's not doing direct work, has been going to EA meetups for a year, identifies as an EA, and doesn't at least have stated plans to donate, what makes them an EA?

Even EAs who are doing direct work, I would argue, should still donate 10% unless that would cause them significant financial hardship.

What happened to the lesson of the drowning child?

My post is related to the Giving What We Can pledge and the broad idea of focusing on "utilons, not fuzzies." From the wording of your comment I'm unclear whether you're unfamiliar with these ideas or are just taking this as an opportunity to say you disagree with them. If you don't think standards like the GWWC pledge are good for EA, then what do you think of the 2%/8% norm I propose here as a better alternative, even if you'd consider it far worse than no pledge at all?

I don't think taking the GWWC pledge should be a prerequisite to consider yourself an EA (which, it's not a prerequisite now). If your post had said "GWWC members should..." or "EAs who donate 10% should..." instead of "EAs should..." then I wouldn't have disagreed with the wording.

That makes sense. I don't think there are any official prerequisites to being an EA, but there are community norms. I think the GWWC pledge (or a direct-work equivalent) is a common enough practical or aspirational norm that I'm comfortable conflating EA and GWWC-adjacent EA for the purposes of this post, but I acknowledge you'd prefer to keep these apart, for a sensible reason.

Thanks for the post! I think there's a more effectiveness-oriented version of your recommendation which would still accomplish your recommendation's goals while maintaining greater fidelity to the message of the importance of effectiveness.

2% could go to feel-good effective charities that you can talk to your mom about, like GiveWell-recommended charities, The Humane League, or Mercy for Animals. 8% could go to the EA cause areas that are less palatable to most people, like AI sentience, shrimp welfare, AI safety, wild animal welfare, etc.

At least for me, I would just feel like I was shirking on my moral duty if I was donating a significant amount to an obviously less cost-effective charity. I would feel like I was putting my warm fuzzies over helping others.
