
This question grew out of a reaction I had to Rob Wiblin's "Consider a wider range of jobs, paths and problems if you want to improve the long-term future". 

While reading, though, one thing struck me as a question that merits more attention. I quote:

I suggest paying a bit more respect to the courage or initiative shown by those who choose to figure out their own unique path or otherwise do something different than those around them.

Given the entrepreneurial slant of EA culture, I worry that some people will end up concluding "we should celebrate risk-taking even more than we already do". 

I wonder whether an EA who is encouraged by this might be more likely to underestimate how costly entrepreneurship is, and whether it is in fact becoming more costly to the individual.


[Epistemic status: a patchwork of unsystematically tracked personal impressions since first encountering EA in 2014, noted down over the course of a work day]

So here's an attempt at partially explaining, from a historical perspective, why it might be getting more difficult to fulfil the necessary conditions to start independent EA projects without burning your "EA" career capital (and why that might have been different in the early days).

This perspective seems important if true because it would imply making more of an effort to update common EA career advice and culture.  

It is only a partial answer, and I am sharing this because I would appreciate other perspectives.


With the increasing establishment of any field, such as EA, the likelihood of success for new projects decreases while the cost of failure for first-time founders increases. If you aren't yet part of one of the core groups that provide a safety net for their entrepreneurs, I would build career capital via safer paths first.

Trust-based networks scale badly

Networks provide access to their resources in a mostly trust-based manner. Resources are often allocated in somewhat formalized ways, but allocation still relies heavily on personal references.

Verification of alignment is always costly

The ways in which a trust-based network can grow are limited: either through explicit entry criteria or through recommender-systems. Designing explicit entry criteria that work is hard, so our civilization mostly relies on opaque arbitration systems running on human brains.

EA organizations and thought leaders do an extraordinary job at documenting their thinking. But no matter how well you are aligned, verification will always remain fairly costly.

Core groups are increasingly hard to access

As an EA founder, you have to make verification of your alignment as cheap as possible. That's costly for a resource-strapped project in its starting phase. Thus, you want to get maximally relevant feedback - ideally from the gatekeepers of the network you want to join.

But if you are on the other side of things, somebody who has a lot of "EA capital" and gets asked for feedback by someone you barely know, you are unlikely to engage. You don't have enough time to properly vet them and their project. Even being associated with it could suggest you endorse it, and you have better things to do than to create a nuanced write-up explaining your engagement and the lessons learned from it.

As a result, the important EA people will not engage with cool new independent projects unless the founder has accumulated sufficient "EA capital" prior to going "off track".

Losses look worse relative to safe bets

Even with relevant feedback it is difficult to build up a successful project. And if you don't have much status, few will recognize a worthy attempt in case of failure, least of all the busy important people.

Unless you are already highly skilled or have a lot of resources to bring your project to fruition, you're likely to fail at a stage that doesn't provide much information about your skill-level. In that case, any failure provides at least a little bit of information about you not being good enough (assuming success is not pure luck).

As more and more smart youngsters enter the community, it gets harder not to be burnt by failure, simply because failure looks worse relative to everyone else who is playing it safe or has succeeded.

EA used to be different because it was young

In the early days of EA, a handful of smart youngsters could start a bunch of projects because everyone knew everyone else. There was a lot of excitement about people exploring. The early entrepreneurs received status just by starting things - even if they weren't immediately successful.

Today, just starting a new project earns you less and less "EA capital". The network has grown, and the space has been carved out. Growth of the network means more unknown people, so core groups are becoming more prudent about whom to extend their trust to.

This is not a bad thing; it's a sign of maturity. I just worry that it hasn't been made explicit often enough:

You're more likely not to get credit for trying nowadays because it's harder to interact with the key nodes of the network, as they now have to protect themselves more vigilantly from incurring significant opportunity and reputational costs.


If this were the dominant dynamic, entrepreneurship would not be worth going into for most people who currently self-identify as EAs. Especially the younger ones, let's say under 30. Even more so if being part of EA is already considered weird in your support circle. Not having a "proper" job might cost you too much social capital to have a larger impact later on in life (when you'd likely have most of your impact).

Given EA demographics, there are only a few highly skilled, wealthy or well-supported people who can afford to resist these incentive structures. Most are better off continuing on the beaten career paths until they have accumulated enough capital not to lose status when taking risky bets, no matter how well calculated.

Thank you for sharing your analysis of what I also see as a major challenge for us to overcome (the challenge of EA entrepreneurship becoming more costly). I agree with many things in your answer, but strongly disagree with the conclusion or 'bottom line'. It seems very bleak, like giving up. Instead I think we should be creating better systems for mentorship and vetting. There are some initiatives trying to do things in this space, such as Charity Entrepreneurship and the longtermist incubator project. I am also excited about the new management and …

This answer brings valuable points, but it rubbed me the wrong way. After thinking about it, I think it feels partisan because all of your points go in one direction; since the question isn't a tautology, I'd expect there to be competing considerations.

Here are some competing considerations:

- Charity Entrepreneurship now exists, and makes entrepreneurship much, much easier. I think that this effect is stronger than any other effect. Note that they offer a stipend.
- I think you're confusing selection effects for environment…

And if you don't have much status, no one will recognize a worthy attempt in case of failure.

This is a fairly harsh indictment of community norms. It directly implies there is nothing different about EA norms in this dimension relative to society at large, which is kind of a problem because there are well-known areas with superior norms: a well-conducted trial reflects well on lawyers even when they lose.

Doesn't make it wrong, naturally. But if true, it seems like it would definitely merit specific attention from the group.

I generally think that full-time social entrepreneurship (in the sense of being dependent on contributed income) early in one's career is quite risky and a bad idea for most people no matter what context or community you're talking about. I would say that, if anything, EA has made this proposition seem artificially attractive in recent years because of a) the unusual amount of money it's been able to attract to the cause during its first decade of existence and b) the high profile of a few outlier founders in the community who managed to defy the odds and become very successful. But the fundamental underlying reality is that it's really hard to scale anything without a self-sustaining business model, and without the promise of scale on the other side it's really hard to justify taking risks.

With that being said, I do think that risk-taking is really valuable to the community and EA is unusually well positioned to enable it without forcing founders to incur the kinds of costs you're talking about. One option, as tamgent mentioned in another comment, is to encourage entrepreneurship as a side project to be pursued alongside a job, full-time studies, or other major commitment. After all, that's how GiveWell, Giving What We Can, and 80,000 Hours all got started, and the lack of a single founder on the job full-time at the very beginning certainly didn't harm their growth. Another option, as EA Funds is now encouraging, is to make a point of generously funding time-limited experiments or short-term projects that provide R&D value for the community without necessarily setting back a founder or project manager in their career. Finally, EA funders could seek to form stronger relationships with funders outside of the community that are aligned on specific cause areas or other narrow points of interest to be better referral sources and advocates for projects that expect to require significant funds over an extended period.

But coming back to your core point, I would definitely encourage most EAs to pursue full-time employment outside of the EA community, even if they choose to stay within the social sector broadly. It's a vast, vast world out there, and all too easy to draw a misleading line from EA's genuinely impressive growth and reach to a wild overestimate of the share of relevant opportunities it represents for anyone trying to make the world a better place.

I agree with much of this answer. However, I'm not sure it's the lack of promise of scale that makes projects not get funded, but rather other reasons. I am also excited about EA Funds now encouraging time-limited all-in experiments. 

To clarify, when I wrote "without the promise of scale on the other side it's really hard to justify taking risks," I was talking from the perspective of the founder pouring time and career capital into a project, not a funder deciding whether to fund it.
Ah sorry, I misunderstood.

Sorry to say I had difficulty parsing what you were trying to say in the post here. 

Thanks for the feedback! I gave it another pass. Is there anything concrete that threw you off or still does? I'd appreciate pointers as I had other people look at it before.

Here are a few minor things I think you could modify for clarity:
Replace 'The sentence that made me think it's worth writing up a reaction was:' with 'From the article:'

Also, you repeat yourself at the end. The last two one-sentence paragraphs could just be one paragraph that says:

'Given the entrepreneurial slant of EA culture, I worry that some people will end up concluding "we should celebrate risk-taking even more than we already do". Isn't that dangerous career advice for the average EA?'
