I've seen a lot of discussion in the EA community recently about the divide between people who think EA should focus on high-level philosophical arguments/thoughts, and those who think EA should work on making our core insights more appealing to the public at large.
Over the last year the topic has become increasingly salient, the big shifts from my perspective being Scott Alexander's Open EA Global post, the FTX crash, and the Wytham Abbey purchase. I quite frequently see those in the first camp (the people who don't want to prioritize social capital) use the argument that epistemics in EA have declined.
For those who haven't studied philosophy: epistemics broadly refers to knowledge itself, or the study of how we gain knowledge, sort good information from bad, and so on. As someone who is admittedly on the side of growing EA's social capital, I find that the argument that the community's epistemics have declined tends to assume a number of things, namely:
- It is a simple matter to judge who has high quality epistemics
- Those with high quality epistemics usually agree on similar things
- It's a given that the path of catering to a smaller group of people with higher quality epistemics will have more impact than spreading the core EA messaging to a larger group of people with lower quality epistemics
In the spirit of changing EA Forum discussion norms, I'll go ahead and say directly that my immediate reaction to this argument is to hear something like: "You and the people who disagree with me are less intelligent than I am, and the people who agree with me are smarter than you as well." In other words, it feels like whoever makes this argument is indirectly saying my epistemics are inferior to theirs.
This is especially true when someone brings up the "declining epistemics" argument to defend EA orgs from criticism, like in this comment. For instance, the author writes:
"The discussion often almost completely misses the direct, object-level, even if just at back-of-the-envelope estimate way."
I'd argue that by bemoaning the intellectual state of EA, one risks focusing entirely on the object level, when in a real utilitarian calculus the things outside the object level can matter much more than the object level itself. The Wytham Abbey purchase is a great example.
This whole split may also point to the divergence between rationalists and newer effective altruists.
My reaction is admittedly not extremely rational or well thought out, and it doesn't have high-quality epistemics backing it. But it's important to point out our emotional reactions to the arguments we make, especially if we ever intend to convince the public of Effective Altruism's usefulness.
I don't have any great solutions to this debate, but I'd like to see less talk of epistemic decline on the EA Forum, or at least have people state the claim more bluntly rather than dressing it up in fancy language. If you think that less intelligent or thoughtful people are coming into the EA movement, I'd argue you should say so directly, to help foster discussion of the actual topic.
Ultimately I agree that epistemics are important to discuss, and that the overall quality of epistemics in EA-related spaces has gone down. However, I think the way this topic is being discussed and leveraged in arguments is toxic to fostering trust in our community, and it assumes that high-quality epistemics are a good in themselves.
I'm just going to register a disagreement that I think comes from a weird intersection of opinions. I despise posting online, but here goes. I think this post is full of applause lights and, quite frankly, white psychodrama.
I'm a queer person of colour and quite left-wing. I really disliked Bostrom's letter, but I still lean hard on epistemics being important. I dislike Bostrom's letter because I think it expresses an untrue belief and equivocates out of grey-tribe laziness. But reading a lot of how white EAs write about being against the letter, it sounds like you're more bothered by issues of social capital and optics for yourselves than by any real impact.
I believe this for two reasons:
1. This post bundles together the Bostrom letter and Wytham. I personally think Wytham quite possibly could be negative EV (mostly because I think Oxford real estate is inflated and the castle is aesthetically ugly and not conducive to good work being done). But the wrongness in the Bostrom letter isn't that it looks bad. I am bothered by Bostrom holding a wrong belief, not by him holding a belief that is optically bad.
2. You bundled AI safety into later discussions about this. But there are lots of neartermist causes that are really weird, e.g. shrimp welfare. Your job as a community builder isn't to feel good and be popular; it's to truth-seek and present morally salient facts. The fact that AI safety is the hard one for you speaks to a cohort difference, not to anything particular about these issues. For instance, in many Silicon Valley circles AI safety makes EA more popular!
Lastly, I don't think the social capital people actually follow the argument through to the full implications of what it means for EA to become optics-aware. Do we now go full Shorism and make sure we have white men in leadership positions so we're socially popular? The discussion devolves to the meta-level of epistemics because the object-level discussion is often low quality, and good epistemics have to exist for object-level utilitarian calculus to even continue, given that we're doing group decision-making. It all just seems like a way to descend into respectability politics and ineffectiveness. I want to be part of a movement that does what's right and true, not what's popular.
On a personal emotional note, I can't help but wonder how the social capital people would have acted in previous eras toward great queer minds. It was just a generation ago that queer people were socially undesirable and hidden away. If your ethics are so sensitive to the feelings of the public, I frankly do not trust them. I can't help but feel that a lot of the fears expressed by mostly white EAs in these social capital posts are status anxieties about their inability to sit around the dinner table and brag about their GiveWell donations.
Yep!