This is a special post for quick takes by Eli_Nathan.

CEA is hiring someone to lead the EA Global program. CEA's three flagship EAG conferences facilitate tens of thousands of highly impactful connections each year, helping people build professional relationships, apply for jobs, and make other critical career decisions.

This role comes with a large amount of autonomy and plays a central part in shaping a key piece of the effective altruism community's landscape.

See more details and apply here!

We’re very excited to announce the following speakers for EA Global: London 2024:

  • Rory Stewart (Former MP, Host of The Rest is Politics podcast and Senior Advisor to GiveDirectly) on obstacles and opportunities in making aid agencies more effective.
  • Mary Phuong (Research Scientist at DeepMind) on dangerous capability evaluations and responsible scaling.
  • Mahi Klosterhalfen (CEO of the Albert Schweitzer Foundation) on combining interventions for maximum impact in farmed animal welfare.

Applications close 19 May. Apply here and find more details on our website. You can also email the EA Global team at hello@eaglobal.org if you have any questions.

Applications are still open for upcoming EA Global conferences in 2024!

• EA Global: London (31 May–2 June) | Application deadline is in ~6 weeks

• EA Global: Boston (1–3 November)

Apply here and find more details on our website. You can also email the team at hello@eaglobal.org if you have any questions.

Want to help CEA improve Swapcard? Swapcard is the networking and scheduling app currently used for EA Global and EAGx events, and their team are asking users for input on new features.

Two of these features are commonly requested by our attendees — calendar synchronization and automatically blocking your schedule if you RSVP for a session.

You can vote for these features to be added or give other feedback at the links below:

Entire Swapcard product roadmap: https://swapcard.notion.site/3ed5acc763e54ce2ad07e7563c0ee9c3?v=3f3b9d52f317449c828e4d4790dbf94d

Calendar synchronization: https://swapcard.notion.site/External-Calendar-Synchronization-a62d85808c45454ca620cdddfa46f791

Automatic schedule blocking: https://swapcard.notion.site/Handle-better-overlapping-of-the-meeting-time-slots-and-sessions-when-requesting-a-meeting-or-bookma-1771ae5e2e1348e981754a6f39913df5

Applications for EA Global: Bay Area 2024 (Global Catastrophic Risks) are still open and close on January 21 at 11:59 pm PT (apply here)!

We’re excited to be hosting our first EA Global focused on global catastrophic risks (GCRs). We'll be welcoming up to 1,000 attendees at the Oakland Marriott City Center and platforming high-quality content related to GCRs, including AI safety, biorisks, nuclear security, and more.

We have limited travel funding available. More information can be found on the event page and in the EA Global FAQ. If you have any questions, please email us at hello@eaglobal.org.

Applications for EAG Boston are still open (here), and our early bird registration deadline is on August 4th! If you were accepted to EAG Bay Area or London this year, you can register directly within our portal and won't need to apply again.

You can view our other events on our website, including EAGxNYC, EAGxBerlin, EAGxAustralia, and EAGxPhilippines.

If you have any questions, you can reach out to the team at hello@eaglobal.org.
