
TL;DR: Requiring hires to be acquainted with EA ideas, or to demonstrate how EA they are, limits the talent pool in a movement where talent is already scarce. Looking outside of EA has challenges but can be a net positive in some circumstances.


Throughout my involvement in the EA world, I've seen that one of the biggest challenges facing effective altruism is attracting people with the right skills to execute its many promising projects. I've witnessed firsthand, in my work in both China and Serbia, how a lack of skilled personnel can hinder an organization's progress.

Take EA Serbia as an example. As the EA movement is relatively new in Serbia, it's been hard to attract members and move them inward through the concentric circles. We decided to hire two part-time employees (one focused on marketing and social media, the other on outreach and developing external relations) who were completely outside of EA but who had relevant professional experience in the charity world. We asked them to take the EA Serbia Introductory Course (our own version of the Introductory EA Program) during their first month at work as part of their onboarding. These people brought their expertise in their professional fields, which made a huge difference in our outreach efforts and established a solid foundation for future EA efforts in Serbia.

I am not the first one to write about this. There have been a variety of posts on the EA forum regarding outreach to mid-career professionals with relevant skills, attempts at EA-specialized hiring agencies, and the like. Also, see here for AI Policy talent gaps. People with both relevant skills and a passion for making a difference can be crucial assets. Moreover, by getting involved in EA, they serve as a vector to spread EA ideas to professional networks that otherwise would remain unconnected to EA, allowing us to reach new groups of people (we’ve seen this with one of our hires now spreading EA Ideas in their professional circles).

While there are certainly some roles for which high levels of EA context and knowledge are necessary, I'd encourage us all to question the impulse to say "every hire must be at least 9 out of 10 on the metric of EAness." For some roles, maybe 6 out of 10 is enough. For some roles, maybe 3 out of 10 is enough. The person designing the programming for EAG probably needs a strong understanding of the EA community to do their work well, and the translator definitely needs to understand the concepts in order to translate them well. But the person who designs your payroll system or makes reservations for your team offsite probably doesn't. Do you need your office manager to be an EA, or would it work out fine to simply hire a skilled office manager who gets familiar with EA during their first few months on the job? That will, of course, depend on your organization's specific circumstances, but you should at least consider that for many roles, a competent professional with several years of experience who is neutral about EA will be able to perform very well on the job.

Key Takeaways


Expertise is important, and it gets things done: 

  • Certain roles demand specific expertise or experience. Sometimes you really do want an experienced project manager with a PMP certification rather than an organized person who writes well, ran the EA club at their school, and has never had a job before. Some skills are much harder than others for an inexperienced person to pick up: consider accounting, marketing, web design, or recruitment, which vary widely in how steep their learning curves are. Between candidates Alice and Bob, if Alice has expertise and Bob does not, for many roles that should probably outweigh the consideration that Bob is "more EA" than Alice. The tax bureau does not care about your mission alignment or whether your confidence intervals are properly calibrated; it needs a CPA designation (or whatever your jurisdiction requires) as a proxy for a certain level of professional knowledge and competence. Additionally, EA talent is concentrated in some important ways: we have more people with expertise in software and economics than in pedagogy or international relations, and more in research than in human resources (multiple EA orgs have struggled with citizenship and immigration issues, with legal compliance, and with turnover). In some functions, EA as a whole would benefit from extending the recruiting net beyond the EA circle. There has long been talk about how hard it is for EA orgs to find skilled managers, and part of the reason is that such a small percentage of the EA community has professional experience as managers.

“Culture fit” assimilation: 

  • When hiring people, fit for the role, for the organization, and for the 'industry' are all important. Although integrating individuals from outside the EA community presents challenges, it's possible to assess compatibility through well-designed selection processes (as Charity Entrepreneurship does) and thoughtful analysis. The ideal might be to identify proto-EAs (people who would be amenable to the ideas of EA but simply haven't encountered them, or who need a bit of a nudge to get more involved) who also hold the right philosophical beliefs.

Industry specific wisdom: 

  • Hiring individuals who bring diverse ideas and methodologies from other industries can significantly enhance innovation within EA organizations. Drawing on experiences from sectors as varied as robotics, manufacturing, and pharmaceuticals can provide fresh perspectives and innovative solutions to existing challenges. It can also reduce redundancy and failure rates by adding diversity of thought and experience. For example, even if I am hired for my recruitment experience, I also bring diversity of thought from having lived and worked in China, in large and small corporations, and with multinational start-ups. Each of those can be useful to my place of work, separately from my main expertise. Hiring an event planner with a decade of experience means getting access to that person's professional network, to suppliers they have worked with before, and to event-specific knowledge that would be relatively inaccessible without that experience (such as industry regulations or professional associates with a bad reputation).


Risk to team cohesion: 

  • Bringing in people who are not fully aligned with EA values might have a negative influence on the rest of the team, so good onboarding and integration are crucial to avoid disturbing the team's dynamic. It may also be unwise to make your first hires non-EA if you plan to hire quickly after that. Every startup founder will tell you that your first several hires are crucial. If a large percentage of your team isn't on board with EA ideas, you may end up diffusing or changing the culture. Hiring someone who agrees with EA ideas but simply has them as a smaller part of their life is unlikely to actively hurt the culture, whereas hiring someone who is apathetic toward or even against EA ideas could well end up a net negative, even if they have highly relevant professional skills.

Slow development: 

  • Any new hire is a bet: with limited information about their skills, you try to predict how well they will perform in the role. If you hire someone unfamiliar with EA, it may take more time to educate them on the EA ideas, communication practices, and cultural norms of your team. A competent professional who hasn't heard of Fermi estimates, scout mindset, or double cruxing can quickly grasp each concept, but you will still need to take extra time to introduce it the first time they encounter it. Multiply that by a dozen different concepts, and this can noticeably slow down the work. This necessitates a balanced approach to onboarding, training, and integration to ensure that productivity is maintained while keeping the core values intact. Merely sharing a long list of acronyms and terminology with the new hire won't work well.

EA jobs provide scarce non-monetary goods:

  • If you hire someone from within the EA community, you are offering them a chance to stay active and engaged in EA, and denying them that just because you want to explore non-EA talent should be done carefully. I understand and appreciate the desire to help our own, and there is a place for that consideration. However, it should not come at the expense of expertise; I think the tradeoff between the two falls on the expertise side.


Overall, the dilemma here is balancing skills and values alignment within effective altruism. The shortage of specialist talent means that many EA professionals end up juggling multiple roles, which can impede an organization's progress. Given the uncertainty about whether external recruits will adapt to EA principles, a strategic alternative might be to offer introductory fellowships (an AI safety fellowship, or programs focused on other fields) to senior professionals. This approach would allow for identifying and intensively supporting those most receptive to a career shift, thereby optimizing impact.

Moreover, hiring someone new to EA in order to get specific expertise can be beneficial, but the hiring and selection process should be approached differently than it sometimes is. With the right selection criteria and questions, job openings posted more broadly outside the EA community, consideration of previous professional background, and an eye for value fit, these hires can bring a lot of expected value to the org and the community.

These are some rough thoughts from my experience, but I'm eager to hear from others in the EA community who have hired externally. What positives and negatives have you encountered?

Special thanks to @gergo @Joseph Alcantara @Dušan D. Nešić (Dushan) @Nina Friedrich  for comments on an earlier draft.






EA, being a fallible movement, is wrong about a lot of things. A lot of people that are not aligned with EA have completely valid reasons for not doing so. If you excessively filter for people that already agree with you on everything, you risk creating a groupthink atmosphere where alternative ideas have no real way to enter the discourse. Take this to the extreme and you pretty much end up with a cult. 

Also, EA is not representative of the general public, and thus will have a hard time knowing how ideas and policies are received by or impact broader demographics. Having normal people around to provide sanity checks is a useful byproduct of hiring more generally, rather than finding people already adjacent to this very small and odd subculture. 

Very strong agree. The 'cons' in the above list are not clearly negatives from an overall "make sure we actually do the most good, and don't fall into epistemic echo chambers" perspective.

One thing I think is often missing from these sorts of conversations is that "alignment with EA" and "alignment with my organization's mission" are not the same thing! It's a mistake to assume that the only people who understand and believe in your organization’s mission are members of the effective altruism community. EA ideas don’t have to come in a complete package. People can believe that one organization’s mission is really valuable and important, for different reasons, coming from totally different values, and without also believing that a bunch of other EA organizations are similarly valuable.

For "core EA" orgs like the Centre for Effective Altruism[1], there's probably near-total overlap between these two things. But for lots of other organizations the overlap is only incidental, and what you should really be looking for is "alignment with my organization's mission". Perceived EA Alignment is an unpredictable measure of that, while also being correlated with a bunch of other things like culture, thinking style, network, and socioeconomic status, each of which you either don't care about or which you don't want to be selecting for in the first place.

  1. ^

I think that is a good point, and I wish that we had included this in the post!

We approached this mainly from the perspective of a community building work (Tatiana's main work), which as a meta-EA job is probably the only type of work for which there is such high overlap between "alignment with EA" and "alignment with my organization's mission." But you are correct. I can see how there would be a lot less overlap for an organization focused on a specific cause.

I think you've missed the main con, and it's quite a subtle disadvantage that would only arise over longer periods of time.

Hiring people who aren't aligned in terms of values can exert subtle pressure to drift toward the mainstream over time. I know some people are going to say something along the lines of "why should we trust ourselves over other people?" and my answer is that if you don't have a particularly high regard for EA, you should go find a group that you do have particularly high regard for and support their work instead. Life's too short to waste on a group you find to be a bit "meh" and there are a lot of different groups out there.

Titoal argues that we should "have normal people around to provide sanity checks". I agree that it is important to try to not get too caught up in the EA bubble and maintain an understanding of how the rest of the world thinks, but I don't see this as outweighing the costs of introducing a high risk of value drift.

There is some merit to the argument that being value-aligned isn't particularly relevant to certain roles, but it's more complex than that, because people's roles can change over time. Suppose you hire an employee for role X, and they later apply to shift to role Y, but you pass them over in favor of an employee who is more value-aligned though less qualified. That's a recipe for internal conflict. In practice, I suspect there are some roles, such as accountant, where professional skills matter more and the person is more likely to be happy sticking to that particular area.


Quick case study. We hired a production officer for EAGxUtrecht who is a professional event manager but wasn't even aware of EA as a thing. She's amazing. 

Thank you for sharing and it's great to hear about another successful case! 

A point each in addition on the pro/con side.

Pro: If a non-EA fills a position and can do the job just as well with less alignment, this frees up an EA for a higher-EV use of their time. The non-EA, being less concerned about expected value, would on average have chosen a less impactful position anyway.

Con: EA positions carry connections, influence, credibility, and other social capital that can allow for impact beyond one's direct job duties. To one who is concerned simply with doing their job duties competently, there may be less incentive to otherwise responsibly use their "insider" position to better the world.  

I'd guess the biggest con is adverse selection. Why is this person accepting a below market wage at a low-conventional-status organization? 

An EA might be the most conventionally talented candidate because they're willing to take the role despite these things.   

Hmm I'd have thought that most EA orgs pay significantly better than the rest of the charity sector, and are competitive with mid-high paying private sector roles? 

I'm pretty confident this is true at a junior level, but is perhaps less so for more senior roles.

Things downstream of OpenPhil are in the 90th+ percentile of charity pay, yes, but why do people work in the charity sector? Either because they believe in the specific thing (i.e. they are EAs) or because they want the warm glow of working for a charity. Non-EA charities offer more warm glow, but maybe there's a corner of "is a charity" and "pays well for a charity even though people in my circles don't get it" that appeals to some. I claim it's not many and EA jobs are hard to discover for the even smaller population of people who have preferences like these and are high competence. 

Junior EA roles sometimes pay better than market alternatives in the short run, but I believe high potential folks will disproportionately track lifetime earnings vs the short run and do something that's better career capital.

I claim it's not many and EA jobs are hard to discover for the even smaller population of people who have preferences like these and are high competence.

On this note, a proactive recruitment approach emphasizes actively reaching out to potential candidates rather than passively waiting for the right individuals to encounter the job description on the right platform at the right time. I plan to write a follow-up post on the recruitment "craft", particularly in the EA field.

I had not heard of double-cruxing before reading about it above and I think I am an EA - haha! In my mind it suffices to be kind of open-minded and curious and have a strong will to be of service/help to others. Moreover, on team cohesion and culture I am not sure I fit neatly into EA either - I often receive a lot of downvotes here on the forum!

I kind of think EA or non-EA is a pretty long sliding scale, and my completely unfounded observation is that a lot of the talent currently employed or funded tends more towards the non-EA end of that scale. I feel like the "very EA" people might to a large degree be "fans" of EA who engage a lot here on the forum but who might not actually make up a high percentage of employment in EA orgs. But I could be wrong here; I have no data to back this up, though one could assess it to some degree from surveys, if there are surveys that go out not only to EAs but also to people employed and funded by "EA orgs".

I suspect that you are correct, Ulrik. A lot of EAs (myself included) often informally refer to people being EA as if that is a particular thing, but in reality it is actually an accumulation of dozens of little bits of knowledge, traits, preferences, and professed beliefs (like most identities/cultures). There are plenty of people that are 100% on board with using evidence and reasoning to do the most good that haven't heard of specific terminology that EAs tend to use, or that haven't considered the repugnant conclusion, and in my mind that doesn't make them any less EA.

Maybe we could think of it as the core stuff that really matters, and all the surface-level stuff that happens to come along. Actually trying to do good matters and being intentional about helping others matters. Interest in particular topics or having read specific books just happens to come along.

Executive summary: Hiring non-EA talent for EA organizations has both pros and cons that should be carefully considered, but can provide valuable expertise and diversity of thought when done strategically.

Key points:

  1. EA organizations often struggle to find people with the right skills, and hiring non-EA professionals can bring valuable expertise and get things done.
  2. Non-EA hires can be assimilated into the EA culture through proper onboarding and screening for philosophical compatibility.
  3. Diverse ideas and wisdom from other industries can enhance innovation and reduce redundancy in EA organizations.
  4. Risks include potential harm to team cohesion if new hires are not aligned with EA values, and slower onboarding due to learning EA concepts and norms.
  5. EA jobs provide scarce non-monetary value to community members, so external hiring should be balanced against supporting EAs.
  6. Strategic approaches like fellowships for senior professionals could help identify those most receptive to an EA career shift.



This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.
