I am the Principal Research Director at Rethink Priorities. I lead our Surveys and Data Analysis department and our Worldview Investigation Team.
The Worldview Investigation Team previously completed the Moral Weight Project and CURVE Sequence / Cross-Cause Model. We're currently working on tools to help EAs decide how they should allocate resources within portfolios of different causes, and how to use a moral parliament approach to allocate resources given metanormative uncertainty.
The Surveys and Data Analysis Team primarily works on private commissions for core EA movement and longtermist orgs, where we provide survey methodology and data analysis.
Formerly, I also managed our Wild Animal Welfare department, and I've previously worked for Charity Science and been a trustee at Charity Entrepreneurship and EA London.
My academic interests are in moral psychology and methodology at the intersection of psychology and philosophy.
Currently, the online EA ecosystem doesn’t feel like a place full of exciting new ideas, in a way that’s attractive to smart and ambitious people
This may be partly related to the fact that EA is doing relatively little cause and cross-cause prioritisation these days (though, since we posted this, GPI has wound down and Forethought has spun up).
People may still be doing within-cause, intervention-level prioritisation (which is important), but this may be unlikely to generate new, exciting ideas, since it assumes causes and works only within them; is often narrow and technical (e.g. comparing slaughter methods); and is often fundamentally unsystematic or inaccessible (e.g. how do I, a grantmaker, feel about these founders?).
Thanks for the post! It's great to see analysis of the LEAF data and engagement with existing EA Survey data.
much of the current research suggests these events [conferences, local groups, and educational programs] to be largely ineffective in encouraging participants to engage further with EA communities
That is not my impression of the existing data.
For example, you cite the 2019 cause prioritization report to say that:
Rethink Priorities’ analysis of the 2019 EA Survey found that 42% of respondents reported changing their primary cause area after becoming involved with an EA community. However, relatively few respondents had actually made career or behavioural changes to align with EA priorities.
I'm afraid I don't see why you think that post supports this claim. That post addressed cause prioritization, not behavioural changes, and I don't think whether people changed their cause prioritization since joining EA is a good proxy for whether they made changes to align with EA priorities. Most respondents already supported EA causes at the time of joining EA (though many switch between causes or change their relative prioritizations over time).
In the report on Engagement from that same year, we find that large numbers of EAs are taking actions aligned with EA priorities (e.g. making EA donations, changing their career plans, volunteering or working in EA jobs, etc.).
a 2023 study of EAGx conferences, which compared the attitudes and behaviours of attendees with non-attendees, found no statistically significant differences between the two groups
I couldn't find a post with the title you gave, but perhaps you are referring to this one? While I was very glad that they did the study, as I commented at the time, it was extremely underpowered, so finding non-significant effects was not surprising.
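To illustrate the power point concretely, here is a minimal sketch (the sample sizes and effect size below are hypothetical illustrations, not the EAGx study's actual numbers) of how little power a between-groups comparison has to detect a modest effect:

```python
# Minimal sketch (hypothetical numbers, not the EAGx study's actual Ns):
# power of a two-sample t-test to detect a modest effect (Cohen's d = 0.2).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for n_per_group in (30, 100, 400):
    power = analysis.solve_power(effect_size=0.2, nobs1=n_per_group,
                                 ratio=1.0, alpha=0.05)
    print(f"n per group = {n_per_group:>3}: power ≈ {power:.2f}")

# With small groups, power is far below the conventional 0.8 threshold,
# so a non-significant result says little about whether a modest real
# effect of attending exists.
```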
Participants' responses were tracked before and after the Leaf 2025 course to evaluate belief changes. Overall, the dataset revealed very little net change in views towards the statement ‘we should prioritise what is evidenced as best over what we emotionally prefer’.
I've not dug into the LEAF data in detail (and thank you again for analyzing it). But it looks like the main reason there was very little increase in agreement with this statement is that respondents already overwhelmingly agreed with it in the pre-course condition: mean ratings were 6.05 out of 7 at the start of the course, leaving almost no room for scores to rise.
Events like EAGx are rising in influence (15% of respondents now cite them as important in their EA journey), so pushing conferences, meetups, and social or other events is a high-leverage way to connect new people to the community.
As the 2024 EA Survey suggests, personal connections are one of the strongest channels through which people first hear about EA (17.9%) and go on to get involved (45%), so your invites and referrals really move the needle and act in a similar way.
I agree. I would also add that personal contacts and EAGx are commonly cited as the largest positive influences on people's ability to have an impact: personal contact with EAs is the most commonly cited (42.3% of respondents), while EAGx is cited by 13.1% of respondents (which should be interpreted in light of the fact that only a minority of EAs have ever attended an EAGx). These factors are both particularly influential for the most highly engaged EAs.
They are also both among the most commonly mentioned sources of interesting and valuable new connections: EAG/EAGx combined is top (31.6%), followed by personal contacts (30.8%), with EAGx specifically cited by 19.2% of respondents.
For reference, about 10% of EAs in the last EA Survey reported their career plan as working in government or policy. That's not very far behind the top categories, and it doesn't account for the 14% of respondents still deciding.
It seems David's comment below is particularly relevant here, and it might be useful to have a two-way table of uptake rates, with University/General population on one axis and Passive/Active on the other. (Let me know if this exists and I'm missing it; otherwise, if you agree this might be useful, I can try to use any relevant surveys to estimate this.)
Thanks Arthur! Unfortunately, I'm not sure that this data exists. It seems that we'd need to know both how many EA members there are at different universities and where they first heard of EA (perhaps CEA could gather this in future groups surveys).
We do have data about where people on campus in general had heard of EA.[1] Interestingly, ~0 of the ~220 people in our sample who seemed to have encountered EA were EAs themselves, which is itself somewhat suggestive of conversion rates.
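As a rough, purely illustrative way to quantify how suggestive that is (taking the ~220 figure at face value and treating "being an EA" as the outcome), an exact binomial interval on 0 out of 220 puts the implied conversion rate below roughly 2%:

```python
# Minimal sketch: with 0 "conversions" observed among ~220 people who had
# encountered EA, how high could the underlying conversion rate plausibly be?
# Uses an exact (Clopper-Pearson) binomial confidence interval.
from statsmodels.stats.proportion import proportion_confint

converted, aware = 0, 220  # ~0 EAs among ~220 respondents aware of EA
low, high = proportion_confint(converted, aware, alpha=0.05, method="beta")
print(f"95% CI for the conversion rate: [{low:.3f}, {high:.3f}]")

# The upper bound comes out around 1.7%, close to the informal
# "rule of three" estimate of 3/220 ≈ 1.4%.
```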
As the table below shows, people on campus were more likely to say they heard of EA from an EA group (14%) or a club fair (7%), which is probably likewise attributable to direct group activity. Some of the people who simply heard about EA around campus or from friends may also be attributable to group activity, even if they were not directly reached by outreach. Many people, though, clearly encountered EA only through more indirect means, e.g. wider media, school, or classes.[2]
| Source | N | % |
|---|---|---|
| Did not remember | 32 | 16.75% |
| Friends | 32 | 16.75% |
| EA Group (unspecified) | 26 | 13.61% |
| Campus | 20 | 10.47% |
| Class | 15 | 7.85% |
| Club fair | 13 | 6.81% |
| Online | 13 | 6.81% |
| High school | 9 | 4.71% |
| FTX / SBF | 8 | 4.19% |
| Podcast | 5 | 2.62% |
| Peter Singer (unspecified) | 3 | 1.57% |
| Work | 3 | 1.57% |
| Family | 2 | 1.05% |
| Book (Peter Singer) | 2 | 1.05% |
| Article | 1 | 0.52% |
| Book | 1 | 0.52% |
| Book (Precipice) | 1 | 0.52% |
| Book (WWOTF) | 1 | 0.52% |
| Books (DGB, Bostrom) | 1 | 0.52% |
| News (Carrick Flynn) | 1 | 0.52% |
| Book (DGB) | 1 | 0.52% |
| TED (Singer) | 1 | 0.52% |
This excludes responses which did not give an interpretable answer as to where they had heard of EA.
Though it is worth bearing in mind that what we count as direct/indirect or higher/lower quality outreach is somewhat theoretically laden (and these dimensions can come apart). I recall that, many years ago, it was more common to believe that people reading books would be 'high fidelity' and that groups might be 'lower fidelity'; that now seems to be a minority view.
Agreed with Jamie's points above.
A couple of additional points:
| Source | EA Survey | General population | Gap |
|---|---|---|---|
| Personal contact | 16% | 4% | -12% |
| 80,000 Hours | 13% | 0% | -13% |
| Book, article or blog post | 9% | 22% | 13% |
| LessWrong | 8% | 0% | -8% |
| I don't remember | 8% | 35% | 27% |
| EA group | 8% | 0% | -7% |
| Podcast | 7% | 7% | 0% |
| SSC | 5% | 1% | -5% |
| TED Talk | 5% | 1% | -4% |
| GiveWell | 3% | 0% | -2% |
| Education | 2% | 30% | 28% |
| GWWC | 2% | 0% | -2% |
|  | 1% | 0% | -1% |
| Search engine | 1% | 0% | -1% |
| Vox | 1% | 0% | -1% |
| OFTW | 1% | 0% | -1% |
| Swiss group | 1% | 0% | -1% |
| TLYCS | 0% | 0% | 0% |
| EAG/EAGx | 0% | 0% | 0% |
| ACE | 0% | 0% | 0% |
| Other | 9% | 0% | -9% |
We'd be happy to repeat either of these surveys if people are interested.
The results are directionally similar if you exclude those who "Don't remember" (a much more common response in the general population).
Thanks Ben!
I think we should be able to do this[1] and I agree it seems interesting. It looks like Google has quite high favourability, and Bill Gates personally has reasonably high favourability, but lower (I did not see recent numbers for the Gates Foundation).
Though there is some complication around whether we're looking at the sentiment of (i) people who've been assessed as truly being aware (most comparable to our 'attitudes among those who have already heard' question, though this would need an additional test for those respondents), or (ii) people who've not claimed awareness (most comparable to our 'attitudes among those who have not heard' question, but less relevant).
Nice idea!
I tried this in a recent conversation of mine and it opted to write about data analysis instead. It's interesting, but not surprising, that what it was interested in writing about varied so much based on the preceding conversation.
Your data reveals something fascinating: "find the most effective ways" outperforms "ways to maximize your impact" by about 0.35 points, despite both essentially describing optimization. This gap hints at something deeper about how we process language and ideas.

I think "most effective" works better because it's inherently comparative and concrete. It implies there's a landscape of options that can be ranked, studied, and chosen between. It suggests a search process, almost scientific in nature. "Maximize your impact," while mathematically similar, feels more abstract - it's what you do after you've already chosen your path, not the exciting process of discovery itself.
There's also something about the word "find" that matters. It frames the reader as an explorer or researcher, not just an executor. Compare "find the most effective ways" to a hypothetical "use the most effective ways" - I suspect the former would still win. The discovery framing makes people feel like investigators rather than followers.
This pattern appears everywhere. Consider how "evidence-based medicine" caught on while "scientific medicine" didn't, or how "growth mindset" spread faster than "belief in malleable intelligence." The winning framings tend to be those that make abstract concepts feel tangible and position the audience as active participants rather than passive recipients.
The most successful framings seem to offer both intellectual clarity and emotional resonance - they make you feel smart for understanding them while also making you feel capable of action. "Most effective" does both: it's precise enough to feel rigorous but accessible enough to feel achievable.
Thanks Micaella!
We'll give some thought to the best way to approach this. One complicating factor is that the LMICs are very heterogeneous (e.g. we might expect substantial differences between the results for Brazil, China, and Afghanistan).
Overall, before looking into this more, I can note that all LMICs make up about 7% of the total sample, and about half of these are Brazil, India, China and the Philippines.
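To make the heterogeneity and sample-size concern concrete, here is a minimal sketch using a purely hypothetical total sample size (the 2,000 figure is an assumption for illustration, not the survey's actual N):

```python
# Minimal sketch with a purely hypothetical total N (2,000 is an assumption
# for illustration, not the survey's actual sample size): how small the
# country-level LMIC subsamples get, and how noisy their estimates would be.
import math

total_n = 2000                  # hypothetical total sample size
lmic_n = round(0.07 * total_n)  # ~7% of respondents are from LMICs
big4_each = lmic_n // 2 // 4    # ~half of those split across Brazil, India,
                                # China and the Philippines

def margin_of_error(p, n):
    """Approximate 95% margin of error for a proportion p estimated from n responses."""
    return 1.96 * math.sqrt(p * (1 - p) / n)

print(f"LMIC subsample: {lmic_n}; roughly {big4_each} respondents per large LMIC")
print(f"MoE on a 50% estimate with n={big4_each}: ±{margin_of_error(0.5, big4_each):.0%}")

# With ~17 respondents per country, a proportion estimate carries roughly
# ±24 percentage points of sampling error, so country-level breakdowns
# would be very noisy even before considering heterogeneity across LMICs.
```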
I find it helpful to distinguish two things, one which I think EA is doing too much of and one which EA is doing too little of: