This is a transcript of my opening talk at EA Global: London 2025. In my talk, I challenge the misconception that EA is populated by “cold, uncaring, spreadsheet-obsessed robots” and explain how EA principles serve as tools for putting compassion into practice, translating our feelings about the world's problems into effective action.
Key points:
* Most people involved in EA are here because of their feelings, not despite them. Many of us are driven by emotions like anger about neglected global health needs, sadness about animal suffering, or fear about AI risks. What distinguishes us as a community isn't that we don't feel; it's that we don't stop at feeling — we act. Two examples:
* When USAID cuts threatened critical health programs, GiveWell mobilized $24 million in emergency funding within weeks.
* People from the EA ecosystem spotted AI risks years ahead of the mainstream and pioneered funding for the field starting in 2015, helping transform AI safety from a fringe concern into a thriving research field.
* We don't make spreadsheets because we lack care. We make them because we care deeply. In the face of tremendous suffering, prioritization helps us take decisive, thoughtful action instead of freezing or leaving impact on the table.
* Surveys show that personal connections are the most common way that people first discover EA. When we share our own stories — explaining not just what we do but why it matters to us emotionally — we help others see that EA offers a concrete way to turn their compassion into meaningful impact.
You can also watch my full talk on YouTube.
----------------------------------------
One year ago, I stood on this stage as the new CEO of the Centre for Effective Altruism to talk about the journey effective altruism is on. Among other key messages, my talk made this point: if we want to get to where we want to go, we need to be better at telling our own stories rather than leaving that to critics and commentators. Since