I was recently listening to a podcast discussion that included two people who had been involved in military special operations units -- one in the Navy SEALs and one in the US Army Special Forces. I was struck by their extremely high level of training, dedication, commitment, and overall ability -- but also by how this had in large part been squandered on fighting a destructive and unproductive war in Afghanistan, supporting questionable CIA operations, and so on.

It occurs to me that people in the EA community are often working on much more important causes, but with far less training, commitment, personal skill, etc.

This leads to my question -- what would it look like if levels of training effort, funding, selection, etc. comparable to what currently goes into preparing elite military units were put into preparing EAs to do as much good in the world as possible? (I don't think that this would necessarily look much like elite military training, to be clear!)

If this first exercise comes up with anything that seems promising -- are there ways that we could potentially 80/20 this and get much of the goods without high costs?


Answers

I think this will vary a lot depending on what kind of work you're aiming to do, but I could imagine a training programme for e.g. promising young grantmakers being very helpful.

Charity Entrepreneurship tries to do this for entrepreneurs.

Adding to that, Lucia Coulter of the Lead Exposure Elimination Project had high praise for Charity Entrepreneurship when I interviewed her:

Charity Entrepreneurship has [...] made a big difference – their support from the incubation program to now has helped with pretty much every aspect of our work. [...] Firstly they provided a two-month full-time incubation program, which I went through (remotely) in the summer 2020. This was where I decided to work on lead exposure (which was an idea researched and recommended by Charity Entrepreneurship), where I paired up with my co-founder Jack, and from where we received our initial seed grant. During the program we learnt a huge amount of extremely relevant and practical material – for example, how to make a cost-effectiveness analysis, how to make a six-month plan, how to develop a monitoring and evaluation strategy, how to hire, and a lot more. Since then Charity Entrepreneurship has provided LEEP with weekly mentoring and wider support through the community of staff, previous incubatees, and advisors. I highly recommend checking out the Charity Entrepreneurship incubation program if anyone is interested!

(emphasis mine; source)

Good point re: Charity Entrepreneurship.

I'm somewhat more skeptical of the grantmaking thing though because there are few enough positions that it is not very legible who is good at it, whether others currently outside the field could do better, etc.

I could be wrong -- I can point to specific things from some grantmakers that I thought were particularly good, for instance -- but it doesn't feel to me like the field most amenable to such a program.

(Note that this is low-confidence and I could be wrong -- if there are more objective grantmaking skill metrics somewhere I'd be very interested to see more!)

Kirsten
Some trainable things I think would help with grantmaking:
- knowledge of the field you're making grants in
- making a simple model to predict the expected value of a grant (looking for a theory of change, forecasting the probability of different steps, identifying the range of possible outcomes -- see the sketch below)
- best practices for identifying early signs a grant won't be worth funding, to save time, without being overly biased against people you don't know or who come from a different background but could eventually do good work
- giving quality feedback to successful and unsuccessful applicants
- engaging with donors (writing up summaries of why you gave different grants, talking to people who are considering donating through your fund)
- evaluating your grants to learn how closely what really happened matched your model

It doesn't seem to me obviously less trainable than being a Navy SEAL.
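As a rough illustration of the "simple model" idea in the list above, here is a minimal sketch in Python. Every step name, probability, and outcome value below is a hypothetical placeholder chosen purely for illustration, not something from the original discussion:

```python
# Minimal sketch of a simple expected-value model for a grant:
# multiply the forecasted probability of each step in the theory of change,
# then weight a range of possible outcomes conditional on success.
# All names and numbers are hypothetical placeholders.

steps = {  # P(step succeeds), for each step in the theory of change
    "project launches": 0.9,
    "reaches target population": 0.6,
    "intervention works as hoped": 0.5,
}

outcomes = [  # (value if this outcome occurs, P(outcome | all steps succeed))
    (1000, 0.2),  # best case
    (300, 0.5),   # typical case
    (50, 0.3),    # weak case
]

p_success = 1.0
for p in steps.values():
    p_success *= p  # assume steps are independent for simplicity

ev_given_success = sum(value * prob for value, prob in outcomes)
expected_value = p_success * ev_given_success

print(f"P(all steps succeed) = {p_success:.2f}")
print(f"Expected value of grant = {expected_value:.1f} (arbitrary units)")
```

A real grantmaking model would of course need to handle correlated steps, counterfactual impact, and uncertainty in the inputs, but even a toy version like this makes the forecasting and evaluation steps in the list above concrete and checkable after the fact.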

There's the CFAR workshop, but it's just a 4-day program. (Though it would take longer to read all of Yudkowsky's writing.)

I'm no expert, but on one plausible reading, US military training is primarily about cultivating obedience and conformity. Of course some degree of physical conditioning is genuinely beneficial, but when's the last time a Navy SEAL got into a fistfight?

For most of the EA work that needs to get done (at the moment), having an army of replaceable, high-discipline drones is not actually that useful. A lot of the movement hinges on a relatively small number of people acting with integrity and thinking creatively.

Instead of intense training processes, EA at the moment relies on a really intense selection process. So the people who end up working in EA orgs have mostly already taught themselves the requisite discipline, work ethic and so on.

My impression is that the people who end up working in EA organizations are not on the same tier of discipline, work ethic, commitment, etc. as elite military forces and are not really even very close?

I don't say that to disparage EA direct workers -- I'm involved in direct work myself -- but my sense is that much more is possible. That said, as you mention, the amount of discipline needed may simply not be as high.

AppliedDivinityStudies
Yeah, again: for highly creative intellectual labor on a multi-decade timescale, I'm not really convinced that working super hard or having no personal life or whatever is actually helpful. But I might be fooling myself, since this view is very self-serving.

I used to listen to the podcast of a former Navy SEAL, and he argues that the idea of obedient drones is totally off for SEALs; I also got the impression they learn a lot of specialized skills for strategic warfare. Here's an article he wrote about this (I haven't read it myself): https://www.businessinsider.com/navy-seal-jocko-willink-debunks-military-blind-obedience-2018-6

I recently learned about Training for Good, a Charity Entrepreneurship-incubated project, which seems to address some of these problems. They might be worth checking out.

I think this is a great exercise to think about, especially in light of somewhat-recent discussion on how competitive jobs at EA orgs are. There seems to be plenty of room for more people working on EA projects, and I agree that it’s probably good to fill that opportunity. Some loose thoughts:

There seem to be two basic ways of getting skilled people working on EA cause areas:
1. Selectively recruiting people who already have the needed skills.
2. Recruiting promising people who might not yet have the needed skills and training them.

Individual organizations can choose both options, depending on their level of resources. But if most organizations choose option 1, the EA community might be underutilizing its potential pool of human resources. So we might want the community in general to use option 2, so that everyone who wants to be involved with EA can have a role—even if individual EA organizations still choose option 1. For this to happen, the EA community would probably need a program whereby motivated people can choose a skillset to learn, are taught that skillset, and are matched with a job at the end of the process. 

Currently, motivated people who don’t yet possess skills are placed into a jumble of 1-on-1 conversations, 80k advising calls, and fellowship and internship listings. Having those calls and filling out internship and fellowship applications takes a ton of time and mental energy, and might leave people more confused than they were initially. A well-run training program could eliminate many of these inefficiencies and reduce the risk that interested people won’t be able to find a job in EA. 

We can roughly rank skill-building methods by the number of people they reach ("scale") and the depth of training they provide. In the list below, "high depth" skill development could lead to being hired for that skill (when one would not have been hired for it otherwise), "medium depth" could warrant a promotion or an increase in seniority level, and "low depth" enhances knowledge in ways that can help someone perform their job better, but probably won't lead to new positions or higher status.

  • Internal development within organizations, like Aaron Gertler mentioned (small scale, medium depth)
  • Internship/fellowship programs (medium scale, medium depth)
  • One-off workshops and lectures (small scale, low depth)
  • Cause area-specific fellowships, like EA Cambridge's AGI Safety Fellowship (large scale, low depth) 
  • A training program like the one I described above (large scale, high depth)
  • An EA university, as proposed here (large scale, high depth)

If we choose option 2, we probably want large scale, high depth ways to train people. I’m interested in hearing people’s thoughts on whether this is a good way to evaluate skill-building methods.

One caveat: there’s a lot more interest in working for the military than there is in working for EA orgs. Since this interest already exists, the military just needs to capitalize on it (although they still spend lots of money on recruitment ads and programs like ROTC). The EA community doesn’t even have great name recognition, so it’s probably premature to assume that we’d have waves of people signing up for such a training program—but it’s possible that we could get to that point with time.

Rethink Priorities has an internship program.

If I recall correctly, they got 700 applications in the first round, so there could be room for a lot more funding for internship programs that have the capacity to absorb it.

Comments

I recently read Can't Hurt Me by David Goggins, as well as Living with a SEAL about him, and found both pretty appealing. I also wondered whether EA could learn anything from this approach, and am pretty sure that this is indeed the case, at least for a subset of people. There is surely also some risk of his "no bullshit / total honesty / never quit" attitude being very detrimental to some, but I assume it can be quite helpful for others.

In a way, CFAR workshops seem to go in a similar-ish direction, don't they? Just much more compressed. So one hypothetical option to think about would be scaling them up into a multi-month program, for highly ambitious and driven people who prioritize maximizing their impact to an unusually high degree. Thinking about it, this does indeed sound at least somewhat like what Charity Entrepreneurship is doing. Although that's a pretty particular and rather "object-level" approach, so I can imagine alternatives that require similarly high levels of commitment but have a different focus could indeed be very valuable.

I think a related question is:

"How much less effective would a project have to be for it to be worth it in terms of possible effectiveness and training value?"

i.e., is it worth funding moonshots with lower EV than, say, GiveDirectly, if they might find great leaders?

This seems related to Ben Todd's recent comment that EA has a leadership bottleneck. If true, why is training more leaders not a top priority? Maybe I'm misunderstanding something. https://twitter.com/ben_j_todd/status/1423318856622346249

What makes you think it isn't a top priority to train more leaders?

Put another way, what is your current impression of EA's "top priorities", on a community building / professional development level?

(CEA is keen on giving people opportunities to run projects; they pay for books and other resources on professional development, and overall seem to care a lot about helping staff prepare for future leadership if they want to do so. I'd guess that Open Phil and other longstanding orgs are similar?)
