Following up on the above: anyone potentially interested in taking part, please fill out this Expression of Interest form (deadline 31st March 2026). Looking forward to hearing from you!
I'm trying to set up a mentorship scheme matching experienced social media creators with exceptional communicators who want to learn how to communicate high-impact ideas and information at scale through social media. This is part of a wider effort to get more EAs with a diverse but previously under-utilised range of skills started on their impact journey.
What are some neglected academic ideas or bits of knowledge that would benefit from being spread widely to the general public through social media?
and...
Do you know anyone who's extremely skilled at social media whom I could approach? Someone who would either be interested in making the content or coaching aspiring content creators?
Thanks in advance for your help!
Hi Nathan, thanks for the additional context.
I want to make it very clear that I am not against a productive and reasoned discussion on how best to manage trauma discussions within a social group, including on the possibility of someone claiming to be a victim in bad faith. But this article never gets to be that because it only considers the one scenario where the supposed victim is acting in bad faith.
The problem is precisely that the author never seriously considers the possibility that the "trauma junkies" might in fact be acting in good faith until the very last section of the article ('word of caution'), at which point he remarkably still advocates for banishing them from the group just as you would a 'violent psychotic' or 'compulsive rapist'. There is no consideration of trade-offs, e.g. what effect this may have on the so-called 'trauma junky' if they turned out to be sincere. Of course, we know that in such cases the effects are quite disastrous, and you can read Fran's recent post CEA's response to sexual harassment to see what that's like. So the article by Eneasz Brodski is not really an interesting, nuanced, substantiated and earnest discussion of a delicate topic, such as one that might be produced with a scout mindset, or indeed rationalist ideals. For an example of something that I think is exactly that, read Fran's post: in it you will find concrete examples where people who do think of themselves as rationalists and Effective Altruists made mistakes that resulted in the effective dismissal of a serious case of sexual harassment. She explains the situation with a remarkable amount of calm and detachment, out of a genuine desire to see how similar situations can be handled better in the future.
I do think it would be genuinely concerning if prominent rationalists were generally dismissive of sexual assault and harassment.
Well... You may now be concerned.
Aside from the examples from Fran above, you might be interested in this article, which notes that the "views discussed involve racism, sexism, fascism, and other troubling ideologies" expressed at an EA afterparty. Full disclosure: I haven't read this article in full, and I'm not fully briefed on the whole Manifold/Manifest drama from 2024, but from a brief skim it seems to list quite a few 'interesting' statements from prominent rationalists (most of whom, I must admit, I had never heard of, because I am quite new to learning about this... and don't really feel a huge compulsion to explore, tbh). See for example this excerpt from Richard Hanania's article The EA Movement Will Be Anti-Woke or Die:
As I’ve previously written, there are certain psychological dynamics that explain why wokeness has conquered western institutions and movements, and only the ones with antibodies to women’s tears will avoid drowning in them.
Does EA have the right antibodies? I once assumed it did based on its ideological and intellectual commitments. But I’m now realizing that the main reason rationalism has been relatively unwoke so far is that the movement has been new, and new movements attract adherents that are disproportionately highly intelligent, non-conformist, and male. But as it has grown in status, the movement has diversified, which has brought all the usual problems.
(Again, full disclosure: I haven't felt particularly keen to read the full article, so I haven't. I don't know who this Richard Hanania is in much detail, but he is clearly quite involved in the EA/rationalist community.)
These dangerous rationalisations of harmful '-isms' are not surprising to me, as I don't place 'rationalists' a cut above any other category of people in their ability to reason and avoid bias. Everyone makes mistakes, even when trying really hard not to, and no one is above making them. But mistakes can have very real negative ethical consequences, which is why I advocated for calling them out early, in a genuine, respectful and earnest manner, so that we can all flourish as a group. ⭐️
Thanks Ivy for sharing this article. Very sorry you were in any way affected by this! It's deeply enraging, but sadly very representative of other things I've seen and of what I was referring to in my earlier comment.
For the benefit of readers of this thread, here's a taster of the article shared by @Ivy Astrix:
Any Community That Tolerates Trauma Junkies Is Unsafe For Everyone Else
You know the kind. The person that bends every gathering and interaction into a hunt for the problematic elements or people within it. The person who is not happy unless we’re all doing the work to eliminate the systemic oppression at our event. The person who loudly centers themselves as leading the charge to making a place or group or scene “safe” for everyone.
This person makes any space that they are let into radically unsafe for every normal person within that space, and destroy any communities they are let into. Tolerating them isn’t a kindness, it’s endangering your loved ones out of cowardice.
And for people who don't have the time to read it for themselves, the article broadly goes like this:
(1) Anyone sharing their trauma or cautioning about causing trauma (people the article refers to as "trauma junkies") is most probably doing so out of bad faith and a desire to gain attention.
(2) Because it is most probably fake, you would be justified in seeing those people as problematic and as net negatives to your community.
(3) Therefore, to safeguard your community, you should probably either shut down any attempts they make to broach the subject or just exclude them altogether.
After about 1,300 words outlining the "argument" (making the case that trauma junkies are "draining leadership resources", causing "crumbling communities", and that it is a moral duty for leaders to stop them), the author writes a "word of concession" that nonetheless reaches quite extreme conclusions about what actions should be taken against those 'trauma junkies': namely, that "for the protection of everyone else they must be isolated, just like the violent psychotic or compulsive rapist must be isolated".
A Word of Concession
Ok, fine. Perhaps trauma junkies “can’t help” the way they are. They’re traumatized themselves, hypervigilant PTSD victims of social media and political warfare. That doesn’t change the fact that they are dangerous to everyone around them. For the protection of everyone else they must be isolated, just like the violent psychotic or compulsive rapist must be isolated. Whether they are “at fault” or “morally responsible” for their behavior is irrelevant. Our first duty is to protect our families and our community. Or these trauma junkies will perpetuate the cycle and hurt more people into further trauma narratives.
I don't think I need to spend time explaining the flaws in this 'argument'.
What's scary is that this person, whom I don't know at all but who, judging from their blog and recommendations (Harry Potter and the Methods of Rationality, Scott Alexander's writing), would probably identify as belonging to the rationalist community, can't see the issues with what they are writing. Or maybe they've decided that holding such views and such 'casual' (read: extremely low to non-existent) epistemic standards is... compatible with being a rationalist?! Clearly they have, as this article has been publicly available for two years now.
The extremely confident tone makes it clear that the author is not interested in having a constructive debate on the best way to handle trauma in communities. It's an attempt at banter, at making fun of others, and basically at branding oneself as a 'cool contrarian'. Exactly what I described in my earlier comment.
The confident tone is also an indication that the kind of social groups this person inhabits, which by all accounts are at least adjacent to rationalist circles (I scanned through the people who liked the article and what they subscribe to on Substack), permit such low-quality discourse with basically no challenge. Only one of the comments, although generally positive towards the message of the article, seems to gently suggest that the author should develop more empathy (it's not written very clearly, though, so I wouldn't know for sure).
If you've never come across this kind of behaviour before, imagine what it feels like to be the target of such an article. Especially when it looks like people are supporting it.
What's the solution? In my view, three things can really help:
- Call out the flaws in someone's arguments or behaviour whenever they occur. The issue is not that flawed arguments resulting in actual harm to people occasionally happen (they are bound to; nobody's perfect), but that they are allowed to continue with impunity.
- Remember that no discourse is ever disentangled from its broader context. No idea is ever expressed by a 'disembodied spirit' (even LLMs are built on outputs from very real people with real biases). No discursive arena is ever truly fair, giving everyone an equal opportunity to express themselves and be heard. We can try to make it that way, but this requires some awareness of potential power imbalances and an anticipatory attempt to mitigate them. In my opinion, someone embodying a real scout mindset looks out for signals of an uneven playing field and makes the necessary adjustments to compensate for it, in a sincere attempt to get at the truth.
- Have epistemic humility, and be wary of views that radically diverge from traditional wisdom. Of course, traditional wisdom is sometimes very wrong and needs correcting. But in my view, the challengers of traditional wisdom most likely to succeed morally are those who do so with humility and caution, and with an earnestness comparable to the magnitude of the change they are proposing (for a very helpful and eloquent description of this approach, see this talk by Toby Ord; I really loved it). Be wary of people who are super light-hearted and bantery when making a proposition that massively contradicts conventional wisdom: chances are the tone they are using is not just a matter of style but a tactical decision to make people overlook the flaws of their arguments, gather virality and ultimately gain influence regardless of the merits of their reasoning (the old art of sophistry: winning support for an argument through surface-level tricks rather than solid substance).
Yes, exactly: the sexist/racist "proofs" are not really proofs, because they are fundamentally flawed, but the originators of such proofs refuse to hear any opposition. It's really an anti-rationalist attitude dressed up as a super-rationalist one. As you say, they're not arguing in good faith (scout mindset); they're just really committed to defending their position (soldier mindset).
In a Moroccan context, I think it's definitely a worthwhile experiment to try to harness the donation potential of the software developer or lawyer and get them to direct some of their donations to high-impact charities, but it's one for which there's not much precedent.
One thing I forgot to mention which might fit your middle group is corporate giving: big companies/banks or local franchises of international groups (e.g. McDonald's) have charity partnerships. This is quite common.
Do you think EA communities underinvest in developing donors outside high-income countries?
My instinct is yes, based on what I observe in my own sector (the arts).
People like Aubrey Bergauer (an arts business guru specialising in creating sustainable revenue streams for arts organisations) keep saying that organisations routinely miss out on a steady revenue stream right under their noses by focussing too much on high-net-worth individuals or institutional funding opportunities and completely overlooking, as potential donors, the people who use their services and can see the impact they make in the community.
Yes, those people have much smaller donation capacity on an individual level, but together they represent a non-trivial source of income. They are also beneficial in other ways: with the right care, they become highly involved and might turn into a free source of advertising and advocacy for your cause. However...
If you are in an LMIC and have considered donating effectively: what made it easier or harder?
I'm from Morocco, and even though charity as a concept is widely practised (mostly for religious and cultural reasons), the notion of donation effectiveness is almost never a consideration for the general public (even less so than in developed countries). Or RATHER, effectiveness is assessed on a very personal level (i.e. I personally know the family I am giving to, and know that my donation will be put to good use), but not through a dispassionate, EA-like approach that looks at the total effectiveness potential of a donation in the abstract, detached from any personal connection to the people or animals you could be helping.
(There is also much of the kind of 'patronage' donation style that @NickLaing is mentioning when talking about the Nigerian context)
So, in summary: although there might be something to gain by mobilising a local population that is both used to charitable giving and closer to the impact of your work, the notions of effective giving and rigorous consideration of impact are not nearly as established as in developed countries. You might therefore have less success in winning donations from local populations, especially as an animal advocacy group, which, unless viewed through the lens of non-species-specific quantified impact (a lens often missing in the general population of many LMICs), might compare unfavourably to human-centric causes.
IMO, the worst sub-group is the intersection of people who call themselves 'rationalists', people with sexist views, and people looking to be edgy or gain notoriety. This group will often try to use their 'rationalism' to justify their harmful '-isms' (sexism, racism, etc.)... using """"data"""" and """reason""", which generates a lot of controversy, which in turn helps them build more of a platform, and so on.
In my opinion this is the most dangerous subcategory of sexist people (as opposed to people who are just casually sexist out of convenience, or just because they can, but have no further motives beyond that), because if you dare to question their methods or conclusions they call you 'woke', 'irrational', or 'unscientific' (by contrast, the casually sexist category will just accuse you of being too uptight, lacking a sense of humour, or making a mountain out of a molehill). These pseudo-rationalists are dangerous because they are not simply being sexist; they are actively constructing an apologia for sexism. As a woman, you can't win against them: either you agree with them that women are less smart/capable/intelligent, or you disagree with their 'highly rationalist proof', which they will claim proves their point, as you are 'clearly' not clever or free-thinking enough to appreciate the 'evidence'. This of course helps them get more attention, as more and more people want either to strongly agree or to strongly disagree with them. Online, this behaviour drives comments, likes and algorithmic traffic towards their profile, which serves their notoriety goals.
I've met my fair share of these over the years.
I also think that AGI is altogether still quite unlikely in the next decade, but I don't need AGI happening in the next decade to be worried about AI's current ability to destabilise our world in a meaningful and potentially catastrophic way.
My main concern is that the pre-AI world was, IMO, not even as prepared as it could have been for "traditional risks": old risks like cyber attacks, geopolitical instability, military escalation, democratic erosion, and so on. I see AI as a complicating factor and a multiplier of those risks, and my cautious nature makes me think we should move even faster on disaster preparedness in general.
Even without AGI in the picture, I think we are underprepared to deal with the risks associated with misuse of current AI capabilities, which really just make it cheaper and easier to run things like cyber attacks and disinformation campaigns at scale (and other things too, like building biological weapons). I'm also very concerned about models being used by militaries to launch missiles and eliminate targets without human oversight. These things are already happening, and I think we are still not devoting enough attention to them.
In summary, because I feel we are not prepared enough TODAY, I see efforts to 1) limit the growth of AI capabilities and 2) have better safeguards against misuse of current capabilities as still important and valuable.
It's very possible that the growth of AI capabilities will be halted or massively slowed anyway by a number of factors you have already discussed (such as the AI bubble popping, or bottlenecks in hardware materials), and I would cautiously welcome those as net positives for the reasons I mentioned. But I would also welcome any voluntary efforts to curtail the growth of future AI capabilities, and to increase safety, global cooperation and regulation around current capabilities, as a way to buy us time to become better prepared.
Haven't watched the videos yet but love the concept of using social media to educate the public about important ideas / spread important knowledge! 😊