Yeah, fair question, though I think estimating both the numerator and the denominator is tricky. Your estimate that I know very roughly ~150-250 EAs is probably about right. But I'd be nervous about concluding "this problem only affects 1 in 50, so it's pretty rare/not a big deal," both because the 3-5 number is mostly about specific people I've been interacting with a lot recently who directly inspired this post (so there could be plenty more I just know less about), and because there's a lot of room for interpretation in how strongly people resonate with different parts of this, how completely they've disengaged from the community, etc.
Before writing the post, I was thinking of maybe 3-5 people who had experienced different versions of this? And since posting I've heard from at least 3 more (depending on how you count) who have long histories with EA but felt the post resonated with them.
So far the reactions I've got suggest that there are quite a lot of people who are more similar to me (still engage somewhat with EA, feel some distance but have a hard time articulating why). That might imply that this group is a larger proportion than the group that totally disengages... but the group that totally disengages wouldn't see an EA forum post, so I'm not sure :)
I'm delighted that you went ahead and shared that the tone felt off to you! Thank you. You're right that I didn't really run this by any newcomers, so that's on me.
(By way of explanation, but not excuse: I mostly wrote the piece while thinking of the main audience as being people who were already partway through the disillusionment pipeline - but then towards the end edited in more stuff that was relevant to newcomers, and didn't adjust who I ran it by to account for that.)
I like this! Thanks for sharing it.
Another analogy I've been playing around with* is "having an impact isn't a sprint or a marathon - it's an endurance hunt." Things I like about this include:
I think your analogy about breathing carries over too - just like on a hike, if you're hunting in a group then no one is helped by you pretending you have more stamina than you do.
Two flaws with this analogy are 1) it's not the friendliest for vegetarians, lol, and 2) there seems to be some controversy over whether persistence hunting is even a thing? Hiking is much better on both of those points!
*Read: started drafting a post, then let it languish for months
I am in contact with a couple of other funding sources who would take recommendations from me seriously, but this fund is the place I have the most direct control over.
Both Matts are long-time earn-to-givers, so they each make grants/donations from their own earnings as well as working with this fund.
This is a great comment. If I were to rewrite this post now, I would make sure to include these.
Also, going back to a conversation with you: if I were to rewrite, I would also try to make it clearer that I'm not trying to give a formal definition of Effective Altruism (which is what it sounds like in the post) - I'm just trying to change the feeling and connotations around it, and how we think about it.
Hm... thinking in terms of 2 types of claim doesn't seem like much of an improvement over thinking in terms of 1 type of claim, honestly. I was not at all trying to say "there are some things we're really sure of and some things we're not." Rather, I was trying to point out that EA is associated with a bunch of different ideas; how solid the footing of each idea is varies a lot, but how those ideas are discussed often doesn't account for that. And by "how solid" I don't just mean on a 1-dimensional scale from less to more solid—more like, the relevant evidence and arguments and intuition and so on all vary a ton, so it's not just a matter of dialing up or down the hedging.
A richer framing for this, which I like a lot, is Holden's "avant-garde effective altruism" (source):
I don't think it has to be that complicated to work this mindset into how we think and talk about EA in general. E.g. you can start with "There's reason to believe that different approaches to doing good vary a ton in how much they actually help, so it's worth spending time and thought on what you're doing," then move to "For instance, the massive income gap between countries means that if you're focusing on reducing poverty, your dollar goes further overseas," and then to "And when people think even more about this, like the EA community has done, there are some more unintuitive conclusions that seem pretty worthy of consideration, for instance..." From there, depending on the interaction, there's space to share ideas in a more contextualized/nuanced way.
That seems like a big improvement over the current default, which is something like "Hi, we're the movement of people who figure out how to do the most good, here are the 4 possibilities we've come up with, take your pick" - and I agree that wouldn't be improved by "here are the ones that are definitely right, here are the ones we're not sure about."