Here's a great (and long, 40min) article that maps the Western cultural space: Memetic Tribes and Culture War 2.0.

Effective Altruism is part of the mapping, under the label "Rationalist Diaspora". The authors write:

Rationalist Diaspora: Incubated on Overcoming Bias and LessWrong, this is an observer tribe in the culture war. Though similar to the New Atheists in that they prize rationality, they do not define themselves in opposition to religion. Thanks to the strength of Eliezer Yudkowsky and Scott Alexander’s writing, and the beliefs and epistemic virtues of the diaspora, they command increasing respect in the culture war. Watch for a popularity boost to Effective Altruism, a struggle with the downsides of increased attention, and possible pressure by the SJAs for the Rationalists to commit to progressive values.

Not sure if this is the correct place to discuss articles like this (is it?), but I feel like it gives us a good mapping of "how we might spread" (i.e. promoting Effective Altruism). What are people's thoughts on this article, EA's place within it, and how it should change how we spread EA? (If at all.)


I tend to be wary and distrustful when I see articles with titles like this. Too often they are long and filled with jargon (or made-up jargon), and while usually basically coherent, they tend to lack much in the way of innovative content; they come off as school or university student papers, even when written by professors or professionals.

This one is quite good and sort of amusing, however. (I am not a fan of Jordan Peterson, but I wonder if he attends the events hosted by the authors of the article, since he is in the same city.) I may use this article as a springboard for my own approach based on it.

Maybe calling us "observers" is right (the same goes for the post-rationalist grouping, which I'm also part of): we have other things to focus on besides politics, and the culture-war framing seems like a way of engaging with politics that is unlikely to be effective toward any ends I think most EAs would care about. Still, maybe there's a way to position EA so that people who burn out on what seems to me the futility of the culture war will think of EA as a sane, reasonable place to put their energy and desire to improve the world, and have it applied in ways more likely to produce meaningful change.

I was nearly going to post this article myself, so I'm glad you already have. I think it provides an interesting framework for understanding people's worldviews (telos, existential threat, etc.). I have found it useful when discussing people's views with them: "What do you think is the purpose of life? What do you fear?" etc.
