This is a special post for quick takes by Saul Munn.

why do i find myself less involved in EA?

epistemic status: i timeboxed the below to 30 minutes. it's been bubbling for a while, but i haven't spent that much time explicitly thinking about this. i figured it'd be a lot better to share half-baked thoughts than to keep it all in my head — but accordingly, i don't expect to reflectively endorse all of these points later down the line. i think it's probably most useful & accurate to view the below as a slice of my emotions, rather than a developed point of view. i'm not very keen on arguing about any of the points below, but if you think you could be useful to my reflection process (or if you think i could be useful to yours!), i'd prefer that you book a call to chat rather than replying in the comments. i do not give you consent to quote my writing in this short-form without also including the entirety of this epistemic status.

  • 1-3 years ago, i was decently involved with EA (helping organize my university EA program, attending EA events, contracting with EA orgs, reading EA content, thinking through EA frames, etc).
  • i am now a lot less involved in EA.
    • e.g. i currently attend uc berkeley, and am ~uninvolved in uc berkeley EA
    • e.g. i haven't attended a casual EA social in a long time, and i notice myself ughing in response to invites to explicitly-EA socials
    • e.g. i think through impact-maximization frames with a lot more care & wariness, and have plenty of other frames in my toolbox that i use to a greater relative degree than the EA ones
    • e.g. the orgs i find myself interested in working for seem to do effectively altruistic things by my lights, but seem (at closest) to be EA-community-adjacent and (at furthest) actively antagonistic to the EA community
  • (to be clear, i still find myself wanting to be altruistic, and wanting to be effective in that process. but i think describing my shift as merely moving a bit away from the community would be underselling the extent to which i've also moved a bit away from EA's frames of thinking.)
  • why?
    • a lot of EA seems fake
      • the stuff — the orientations — the orgs — i'm finding it hard to straightforwardly point at, but it feels kinda easy for me to notice ex-post
    • there's been an odd mix of orientations toward [ aiming at a character of transparent/open/clear/etc ] alongside [ taking actions that are strategic/instrumentally useful/best at accomplishing narrow goals... that also happen to be mildly deceptive, or lying by omission, or otherwise somewhat slimy/untrustworthy/etc ]
      • the thing that really gets me is the combination of an implicit (and sometimes explicit!) request for deep trust alongside a level of trustworthiness that doesn't live up to that expectation.
        • it's fine to be in a low-trust environment, and also fine to be in a high-trust environment; it's not fine to signal one and be the other. my experience of EA has been that people have generally behaved extremely well/with high integrity and with high trust... but not quite as well & as high as what was written on the tin.
      • for a concrete ex (& note that i totally might be screwing up some of the details here, please don't index too hard on the specific people/orgs involved): when i was participating in — and then organizing for — brandeis EA, it seemed like our goal was (very roughly speaking) to increase awareness of EA ideas/principles, both via increasing depth & quantity of conversation and via increasing membership. i noticed a lack of action/doing-things-in-the-world, which felt kinda annoying to me... until i became aware that the action was "organizing the group," and that some of the organizers (and higher up the chain, people at CEA/on the Groups team/at UGAP/etc) believed that most of the impact of university groups comes from recruiting/training organizers — that the "action" i felt was missing wasn't missing at all, it was just happening to me, not from me. i doubt there was some point where anyone said "oh, and make sure not to tell the people in the club that their value is to be a training ground for the organizers!" — but that's sorta how it felt, both on the object-level and on the deception-level.
      • this sort of orientation feels decently representative of the 25th percentile end of what i'm talking about.
    • also some confusion around ethics/how i should behave given my confusion/etc
      • importantly, some confusions around how i value things. it feels like looking at the world through an EA frame blinds me to things that i actually do care about, and blinds me to the fact that i'm blinding myself. i think it's taken me a while to know what that feels like, and i've grown to find that blinding & meta-blinding extremely distasteful, and a signal that something's wrong.
        • some of this might merely be confusion about orientation, and not ethics — e.g. it might be that in some sense the right doxastic attitude is "EA," but that the right conative attitude is somewhere closer to (e.g.) "embody your character — be kind, warm, clear-thinking, goofy, loving, wise, [insert more virtues i want to be here]. oh and do some EA on the side, timeboxed & contained, like when you're donating your yearly pledge money."
  • where now?
    • i'm not sure! i could imagine the pendulum swinging more in either direction, and want to avoid doing any further prediction about where it will swing for fear of that prediction interacting harmfully with a sincere process of reflection.
    • i did find writing this out useful, though!

Thanks for sharing your experiences and reflections here — I really appreciate the thoughtfulness. I want to offer some context on the group organizer situation you described, as someone who was running the university groups program at the time.

On the strategy itself:
At the time, our scalable programs were heavily shaped by evidence we had seen that much of the impact came from the organizers themselves. We of course did want groups to go well more generally, but in deciding where to put our marginal resources, we focused on group organizers. It was a fairly unintuitive strategy — and I get how that could feel misaligned or even misleading if it wasn't clearly communicated.

On communication:
 We did try to be explicit about this strategy — it was featured at organizer retreats and in parts of our support programming. But we didn’t consistently communicate it across all our materials. That inconsistency was an oversight on our part. Definitely not an attempt to be deceptive — just something that didn’t land as clearly as we hoped.

Where we’re at now:
We've since updated our approach. The current strategy is less narrowly focused on organizers and more on helping groups be great overall. That said, we still think a lot of the value often comes from a small, highly engaged core — one that often includes organizers, but not exclusively.

In retrospect, I wish we'd communicated this more clearly across the board. When a strategy is unintuitive, a few clear statements in a few places often aren't enough to make it legible. Sorry again if this felt off — I really appreciate you surfacing it.

Thanks for clarifying your take!

I'm sorry to hear about those experiences. 

Most of the problems you mention seem to be about the specific current EA community, as opposed to the main values of "doing a lot of good" and "being smart about doing so."

Personally, I'm excited for certain altruistic and smart people to leave the EA community, as it suits them, and do good work elsewhere. I'm sure that being part of the community is limiting to certain people, especially if they can find other great communities. 

That said, I of course hope you can find ways for the key values of "doing good in the world" and similar to work for you. 

I appreciated you expressing this.

Riffing out loud ... I feel that there are different dynamics going on here (not necessarily in your case; more in general):

  1. The tensions where people don't act with as much integrity as is signalled
    • This is not a new issue for EA (it arises structurally despite a lot of good intentions, because of the encouragement to be strategic), and I think it just needs active cultural resistance
      • In terms of writing, I like Holden's and Toby's pushes on this; my own attempts here and here
      • But for this to go well, I think it's not enough to have some essays on reading lists; instead I hope that people try to practice good orientation here at lots of different scales, and socially encourage others to
  2. The meta-blinding
    • I feel like I haven't read much on this, but it rings true as a dynamic to be wary of! Where I take the heart of the issue to be that EA presents a strong frame about what "good" means, and then encourages people to engage in ways that make aspects of their thinking subservient to that frame
  3. As someone put it to me, "EA has lost the mandate of heaven"
    • I think EA used to be (in some circles) the obvious default place for the thoughtful people who cared a lot to gather and collaborate
      • I think that some good fraction of its value came from performing this role?
    • Partially as a result of 1 and 2, people are disassociating with EA; and this further reduces the pull to associate
    • I can't speak to how strong this effect is overall, but I think the directionality is clear

I don't know if it's accessible (and I don't think I'm well positioned to try), but I still feel a lot of love for the core of EA, and would be excited if people could navigate it to a place where it regained the mandate of heaven.

"why do i find myself less involved in EA?"

You go into more detail later and answer other questions, like what caused some of your reactions to EA-related things, but an interesting thing here is that you are looking for the cause of an absence: of something that is not happening, rather than something that is.

> it feels like looking at the world through an EA frame blinds me to things that i actually do care about, and blinds me to the fact that i'm blinding myself.

I can strongly relate; I had the same experience. I think in my case it's due to a Christian upbringing, or some kind of need for external validation. I think many people don't experience that, so I wouldn't say that's an inherently EA thing; it's more about the attitude.

 
