Edit: If you are landing here from the EA Forum Digest, note that this piece is not about Manifest, and I don't want it to be framed as being about Manifest.
Recently, I've noticed a growing tendency within EA to dissociate from Rationality. Good Ventures has stopped funding efforts connected with the rationality community and with rationality itself, and there are increasing calls for EAs to distance themselves.
This trend concerns me, and I believe an important distinction gets lost when considering this split.
We need to differentiate between 'capital R' Rationality and 'small r' rationality. By 'capital R' Rationality, I mean the actual Rationalist community, centered around Berkeley: a package deal that includes ideas about self-correcting lenses and systematized winning, but also extensive jargon, cultural norms like polyamory, a high-decoupling culture, and familiarity with specific memes (ranging from 'Death with Dignity' to 'came in fluffer').
On the other hand, 'small r' rationality is a more general concept. It encompasses the idea of using reason and evidence to form conclusions, scout mindset, and empiricism. It also includes a quest to avoid getting stuck with beliefs resistant to evidence, techniques for reflecting on and improving mental processes, and, yes, many of the core ideas of Rationality, like understanding Bayesian reasoning.
If people want to distance themselves, it's crucial to be clear about what they're distancing from. I understand why some might want to separate from aspects of the Rationalist community – perhaps they dislike the discourse norms, worry about negative media coverage, or disagree with prevalent community views.
However, distancing yourself from 'small r' rationality is far more radical and likely less considered. It's similar to rejecting core EA ideas like scope sensitivity or cause prioritization just because one dislikes certain manifestations of the EA community (e.g., SBF, jargon, hero worship).
Effective altruism is fundamentally based on pursuing good deeds through evidence, reason, and clear thinking; in fact, when early effective altruists were looking for a name, one of the top contenders was 'rational altruism'. Discarding the aspiration to think clearly would, in my view, remove something crucial.
Historically, the EA community inherited much of its epistemics from Rationality[1] – including discourse norms, an emphasis on updating on evidence, and a spectrum of thinkers who don't hold either identity closely but can be associated with both EA and rationality.[2]
Here is the crux: if the zeitgeist pulls effective altruists away from Rationality, they should invest more in rationality, not less. As it is critical for effective altruism to cultivate reason, someone will need to work on it. If people in some way connected to Rationality are not who EAs will mostly talk to, someone else will need to pick up the baton.
- ^
Claire Zabel expressed a similar worry in 2022:
Right now, I think the EA community is growing much faster than the rationalist community, even though a lot of the people I think are most impactful report being really helped by some rationalist-sphere materials and projects. Also, it seems like there are a lot of projects aimed at sharing EA-related content with newer EAs, but much less in the way of support and encouragement for practicing the thinking tools I believe are useful for maximizing one’s impact (e.g. making good expected-value and back-of-the-envelope calculations, gaining facility for probabilistic reasoning and fast Bayesian updating, identifying and mitigating one’s personal tendencies towards motivated or biased reasoning). I’m worried about a glut of newer EAs adopting EA beliefs but not being able to effectively evaluate and critique them, nor to push the boundaries of EA thinking in truth-tracking directions.
- ^
The EA community actually inherited more than just ideas about epistemics: compare, for example, Eliezer Yudkowsky's 2007 essay on Scope Insensitivity with current introductions to effective altruism in 2024.
I don't think so. I think, in practice:
1. - Some people don't like the big-R community very much.
AND
2a. - Some people don't think improving the EA community's small-r rationality/epistemics should be one of the top ~3-5 EA priorities.
OR
2b. - Some people do agree this is important, but don't clearly see the extent to which the EA community imported healthy epistemic vigilance and norms from Rationalist or Rationality-adjacent circles.
=>
- As a consequence, they risk distancing themselves from small-r rationality as collateral damage / by neglect.
Also, I think many people in the EA community don't think it's important to try hard at being small-r rational at the level of aliefs. Whatever the actual situation revealed by actual decisions, I would expect the EA community to at least pay lip service to epistemics and reason, so I don't think stated preferences are strong evidence.
"Being against small-r rationality is like being against kindness or virtue; no one thinks of themselves as taking that stand."
Yes, I do agree almost no one thinks about themselves that way. I think it is maybe somewhat similar to "being against effective charity" - I would be surprised if people thought about themselves that way.