[epistemic status: articulation of a position I kind of believe and think is under-articulated, but am unsure of the strength of]
I think EA has a lot of great ideas. I wish more people in the world deeply understood them, and took ~EA principles seriously. I'm very into people studying the bodies of knowledge that EA has produced, and finding friends and mentors in the ecosystem.
But I also think that EA is still a tiny corner of the world, and that there are a lot of important networks and bodies of knowledge beyond it. When I think about the optimal allocation of people who are bought into EA, I want quite a lot of them to go out and interact with different systems and peer groups in the world, learn from them, and make connections.
In principle this should be pretty accessible. But I worry that our implicit social structures send the message "all the cool people hang around the centrally EA spaces", in a way that doesn't really support people in actually making these exploring moves while staying engaged with and encouraged by EA.
I think that this is one of the (if not the) most important problems to fix in EA messaging / status-granting.[1] Note that I don't think we want to slow down people coming into the EA bubble -- it's often healthy and good for people to get up to speed on a lot of stuff, which gives them better context for subsequent decisions. So the challenge is to encourage people to graduate to exploring without making exploring itself so high-status that people jump straight there without first learning the cool stuff that EA has to offer.
What could we do about it? Some options:
- Encourage a narrative something like "when your EA learning slows down, that's often the time to dive back into the wider world"
- Celebrate people who follow this trajectory
- Make sure that community support structures are helpful and functional for people who have a lot of EA knowledge but are now out exploring rather than working as "full-time EA professionals"
I'd be keen to see fleshed out versions of these, or other ideas.
Absent good fixes here, I'm inclined to celebrate a certain amount of EA disillusionment: it seems important that a fraction of super talented people go and explore different areas, and if that's easier to access via disillusionment with EA, then so much the worse for people's good opinions of EA. But this path seems worse than a deliberate fix if one is available: it creates bad feeling, and it makes it harder for people to exit exploring mode and start working with the core of the community when that's the right move.
N.B. I'm making a directional claim here. It's of course possible to imagine getting to a stage where too many people go and explore, draining the pool of people trying to work on the most crucial things. What would be too much exploration? My guess is that in equilibrium, the ideal might be for 10-20% of the people skilled enough to do really important work in the core to be exploring instead, along with a larger group around them who can't yet find crucial work in the core (but hope to someday). But I don't put much stock in my numbers; I'm interested in takes from people who would go higher or lower.
[1] Another candidate: wanting people who can think for themselves, but granting social status to people who appear to come to the same conclusions as leadership.
This seems like a helpful sentiment for the community. To share a personal experience: my first full-time job wasn't in EA, and I'm pretty glad I took it.
I worked at an early-stage startup. There were lots of opportunities to take on responsibility and learn from experience: talking with customers and investors, hiring new people, and shipping software. The founders were much more experienced and accomplished than I was, and seeing how they worked taught me a lot. I think work performance is heavy-tailed, and being around people far more capable than me was really helpful for showing me what's possible and how to get there.
In my opinion the startup's impact was mildly positive, but the main value for me was educational. I stayed engaged with EA the entire time via friends, EA Global, and, honestly, mainly this website. My employer matched some of my donations, and I was able to work on some AI-alignment-adjacent stuff (reducing racial bias in student loan approvals!), both of which helped me stay motivated. I do regret staying too long: nearly three full years, probably more than I needed. But now I'm happily back in EA-land working on technical AI safety.
On the current margin, I'd encourage more young people to work at high-performing organizations for short periods to learn how the sausage is made. You can stay in touch with EA and come back later with more skills and experience for direct work.