I've seen a lot of discussion in the EA community recently about the divide between people who think EA should focus on high-level philosophical arguments/thoughts, and those who think EA should work on making our core insights more appealing to the public at large.
Over the last year the topic has become increasingly salient; the big shifts from my perspective were Scott Alexander's Open EA Global post, the FTX crash, and the Wytham Abbey purchase. I quite frequently see those in the first camp (people who don't want to prioritize social capital) argue that epistemics in EA have declined.
For those who haven't studied philosophy, epistemics broadly refers to knowledge itself, or the study of how we gain knowledge, sort good information from bad, and so on. As someone who is admittedly on the side of growing EA's social capital, when I see the argument that the community's epistemics have declined, it seems to me the argument assumes a number of things, namely:
- It is a simple matter to judge who has high quality epistemics
- Those with high quality epistemics usually agree on similar things
- It's a given that the path of catering to a smaller group of people with higher quality epistemics will have more impact than spreading the core EA messaging to a larger group of people with lower quality epistemics
In the spirit of changing EA Forum discussion norms, I'll go ahead and say directly that what I hear in this argument is something like: "You and the people who disagree with me are less intelligent than I am, and the people who agree with me are smarter than you as well." In other words, it feels like whoever makes this argument is indirectly saying my epistemics are inferior to theirs.
This is especially true when someone brings up the "declining epistemics" argument to defend EA orgs from criticism, like in this comment. For instance, the author writes:
> The discussion often almost completely misses the direct, object-level, even if just at back-of-the-envelope estimate way.
I'd argue that by bemoaning the intellectual state of EA, one risks focusing entirely on the object level, when in a real utilitarian calculus the things outside the object level can matter much more than the object level itself. The Wytham Abbey purchase is a great example.
This whole split may also point to the divergence between rationalists and newer effective altruists.
My reaction is admittedly not extremely rational or well thought out, and it doesn't have high-quality epistemics backing it. But it's important to point out the emotional reactions our arguments provoke, especially if we ever intend to convince the public of Effective Altruism's usefulness.
I don't have any great solutions to this debate, but I'd like to see less talk of epistemic decline on the EA Forum, or at least have people state the claim plainly rather than dressing it up in fancy language. If you think that less intelligent or thoughtful people are coming into the EA movement, I'd argue you should say so directly to help foster discussion of the actual topic.
Ultimately I agree that epistemics are important to discuss, and that the overall quality of epistemics in EA-related spaces has gone down. However, I think the way this topic is being discussed and leveraged in arguments is toxic to fostering trust in our community, and it assumes that high-quality epistemics are a good in themselves.
Thanks for sharing your side here. That seems really frustrating.
This is definitely a heated topic, and it doesn't seem like much fun on any side of it.
I think in the background, there are different clusters of people who represent different viewpoints regarding philosophical foundations and what good thinking looks like. Early EA was fairly small, and correspondingly, it favored some specific and socially unusual viewpoints.
Expanding "EA" to more people is really tricky. It's really difficult to expand in ways where newer people are highly aligned with clusters of the "old guard."
I think it's important to see both sides of this. Newer people really hate feeling excluded, but at the same time, if the group grows too much, then we arguably lose many of the key characteristics that make EA valuable in the first place. If we started calling everything and everyone "EA", then there basically wouldn't be EA.
Frustratingly, I think it's difficult to talk about on either end. My guess is that much of the uneasiness is not said at all, in part because people don't want to get attacked. I think saying "epistemics are declining" is a way of trying to be a bit polite. A much more clear version of this might look like flagging specific posts or people as good or bad, but no one wants to publicly shame specific people/posts.
So right now we seem to be in a situation where both sides don't particularly trust each other, and are often hesitant to say things to each other for exactly that reason.
For what it's worth, as good as some people might think their (or EA's) epistemics are, the fact that this discussion is going so poorly demonstrates, I think, that there's a lot of room for improvement. (It's fair to say that being able to discuss heated issues online is a mark of good epistemics, and right now our community isn't doing terrifically here.)
If I were to extract generalizable lessons from the FTX collapse, the major changes I would make are:
EA should stay out of crypto until and unless the situation improves to the point that it doesn't have to rely on speculators. One big failure was that EAs thought they could pick winning investments better than other investors could.
Good governance matters. By and large, EA failed at basic governance tasks, and I think governance needs to be improved. My thoughts are similar to those in this post:
https://forum.effectivealtruism.org/posts/sEpWkCvvJfoEbhnsd/the-ftx-crisis-highlig...