
Followup to Dealing with Network Constraints

Epistemic Status: I spent some time trying to check if Mysterious Old Wizards were important, and reality did not clearly tell me one way or another. But, I still believe it and frequently reference it and figured I should lay out the belief.


Three bottlenecks that the EA community faces – easily mistaken for each other, but with important differences:

Mentorship – People who help you learn skills, design your career, and gain important context about the EA landscape that helps you figure out how to apply those skills.

Management – Within a given org or existing hierarchy, someone who figures out what needs doing and who should do it. This can involve mentorship of employees who are either new, or need to train in new skills.

Finally, what I call Mysterious Old Wizards – Those who help awaken people's ambition and agency.

I mention all three concepts to avoid jargon confusion. Mysterious Old Wizards are slightly fungible with mentors and managers, but they are not the same thing. But first, let's go over the first two.

Mentorship and Management Bottlenecks

Mentorship and Management are (hopefully) well understood. Right now, my guess is that management is the biggest bottleneck (with mentorship a close second). But this doesn't mean there are any obvious changes to make to our collective strategy.

The people I know of who are best at mentorship are quite busy. As far as I can tell, they are already putting effort into mentoring and managing people. Mentorship and management also both directly trade off against other high value work they could be doing.

There are people with more free time, but those people are also less obviously qualified to mentor people. You can (and probably should) have people across the EA landscape mentoring each other. But, you need to be realistic about how valuable this is, and how much it enables EA to scale.

A top-tier mentor with lots of skills and context can help ensure someone thinks through lots of relevant considerations, or direct them in the most useful ways. A medium-tier mentor is more likely to be misguided about some things, or missing some context.

A newcomer to the field who's just read the obvious blogposts might be able to help a newer-comer learn what to read, but there's going to be a lot of stuff they don't know.

A lot of EA content is subtle and detailed, and easy to accidentally compress into something misleading. (For example, 80k might write a nuanced article saying "You should focus on talent gaps, not funding gaps", but this gets translated into "EA is talent constrained", and then people repeat that phrase without linking to the article, and then many people form an inaccurate belief that EA needs "pretty talented people", rather than "EA needs very specific talents that are missing.")

I think the way to grow mentorship and management capacity involves long-term planning and investment. There aren't free resources lying around we can turn into mentorship/management. You can invest in mentoring people who grow into new mentors later, but it takes a while.

I think there is room to improve EA mentorship. But it's a fairly delicate problem that involves re-allocating resources that are currently being spent fairly carefully.

Mysterious Old Wizards

"I'm looking for someone to share in an adventure"

In The Hobbit, Bilbo Baggins wakes up one day to find Gandalf at his door, inviting him on a quest.

Gandalf does not teach Bilbo anything. He doesn't (much) lead the adventuring party, although he bails them out of trouble a few times. Instead, his role in the story is to believe in Bilbo when nobody else does, not even Bilbo himself. He gives Bilbo a bit of a prod, and then Bilbo realizes, mostly on his own, that he is capable of saving lives and outwitting dragons.

In canon Harry Potter, Dumbledore plays a somewhat similar role. In the first five books, Dumbledore doesn't teach Harry much. He doesn't even give him quests. But a couple times a year, he pops in to remind Harry that he cares about Harry and thinks he has potential.

Showing up and Believing in You

Some people seem to be born ambitious and agentic. Or at least, they gain it fairly early on in childhood.

But I know a fair number of people in EA who initially weren't ambitious, and then at some point became so. And anecdotally, a fair number of those people seem to have had some moment when Someone They Respected invited them out to lunch or something, sat them down and said "Hey, what you're working on – it's important. Keep doing it. Dream bigger than you currently are allowing yourself to dream."

This is often accompanied by some advice or mentorship. But I don't think that's always the active ingredient.

The core elements are:

  • The wizard is someone you respect. They clearly have skills, competence or demonstrated success such that you actually take their judgment more seriously than your own.
  • The wizard voluntarily takes time out of their day to contact you and sit down with you. It might only be for an hour. It's not just that you went to them and asked "do you believe in me?". They proactively did it, which makes it a costly signal of importance.
  • They either tell you that the things you are doing matter and that you should invest a lot more in doing them. Or, maybe, they tell you that you're wasting your talents and should be doing something more important. Either way, they give you some sense of direction.

Network Bottlenecks

I think all three types of people are in short supply, and we have limited capacity to grow these resources. But one nice thing about mysterious old wizards is that they don't have to spend much time. Mentorship and management require ongoing investment. Mysterious Old Wizards mostly make you go off and do the work yourself.

In my model, you can only mysterious-old-wizard for people who respect you a lot. I wouldn't go around trying to do it frivolously. It ruins the signal if it turns into a formality that people expect you to do. But, I do think people should be doing it more on the margin.

Comments



My suggestion for people who are best at mentorship but are quite busy is for the organizations where they work to grant them time for mentorship, say 1-2 hours per week for one month, per mentee. That way, they can have considerably robust and effective conversations, and if the said mentee wants further time with them, that will be at their discretion. This can be planned and scheduled well by the organisation.

I want to strongly recommend that every EA-aligned organization provide or have a mentorship program that allows staff who want to be mentors to provide direction and guidance on the cause areas of their interest and their areas of expertise and competence. This is not compulsory, but necessary.

The people I know of who are best at mentorship are quite busy. As far as I can tell, they are already putting effort into mentoring and managing people. Mentorship and management also both directly trade off against other high value work they could be doing.

There are people with more free time, but those people are also less obviously qualified to mentor people. You can (and probably should) have people across the EA landscape mentoring each other. But, you need to be realistic about how valuable this is, and how much it enables EA to scale.

Slight pushback here: I've seen plenty of folks who make good mentors but who wouldn't be doing much mentoring if not for systems in place to make that happen (they stop doing it once they aren't within whatever system was supporting their mentoring). This makes me think there's a large supply of good mentors who just aren't connected in ways that help them match with people to mentor.

This suggests a lot of the difficulty with having enough mentorship is that the best mentors need to be good not only at mentoring but also at starting the mentorship relationship. Plenty of people, though, seem able to be good mentors if someone does the matching part for them and creates the context between them and their mentees.

That is helpful, thanks. I've been sitting on this post for years and published it yesterday while thinking generally about "okay, but what do we do about the mentorship bottleneck? how much free energy is there?", and "make sure that starting-mentorship is frictionless" seems like an obvious mechanism to improve things.

This seems like a useful concept to have.

FWIW, I think something akin to a mysterious old wizard was relevant in my EA-aligned career journey. 

The way I've been phrasing it is that, once I got clear indications that I was likely to be offered a research role at an EA org (Convergence Analysis), I felt like I'd gotten a "stamp of approval" saying it now made sense for me to make independent posts to the Forum and LessWrong as well. I still felt uncertain about whether I'd have anything to say that was worth reading and wasn't just reinventing the wheel, whether I'd say it well, whether people would care, etc., but I felt much less uncertain than I had just before that point.[1] 

So maybe regular, formal job/project application processes already do a lot of the work we'd otherwise want mysterious old wizards to do? 

But I still think there's room for mysterious old wizards, as you suggest. I've tried to fill a mild (i.e., caveated) version of this role for a couple people myself.

[1] My data point is a bit murkier than I made it sound above, for reasons such as the following:

  • I had already started drafting a post that was related to the first post I ended up actually posting
    • Though I still think the indications of a likely job offer probably brought forward the date I started posting, and increased how many posts I ended up writing around then (shooting for a sequence right away, rather than just one exploratory post)
  • I had also been offered an operations role at a high-status EA org at the same time, which also provided some degree of "stamp of approval"
  • I also had various other "stamp of approval"-ish things around the same time, e.g. from conversations at an EAG and an EAGx
  • I'd set the goal to "get up to speed" to a certain extent, and then started posting things, and if I recall correctly I'd felt that the first part should last me most of 2019 and I indeed felt I'd basically completed that part by the end of 2019. So that probably also caused me to switch into a mode of "alright, let's actually start posting now". 

I realize this will sound crazy, but: 

  • Maybe bad mentors are even more important than good mentors

A good mentor will tell you smart things, you'll follow them, see good results, and maybe think, "Wow! I'm so lucky to have a good mentor. I'll ask them about X, Y and Z." This reinforces the mentor-mentee dependency cycle.

A bad mentor will tell you stupid things, you'll follow them, see terrible results and hopefully think, "Wow! That mentor was terrible. I'll ask someone else about X, Y and Z." This frees up the bad mentor to "help" others.

A bad mentor who believes in you but provides terrible advice is perhaps a Mysterious Old Wizard. A more common situation is a loving, kind parent or wonderful friend who believes in you more than you believe in yourself!
