Are we too cocky with EA funding or EA jobs; should EAs prepare for economic instability?
EA feels flush with cash, jobs, and new projects. But we have mostly “grown up” as a movement after the Great Recession of 2008 and may not be prepared for economic instability.
Many EAs come from very economically and professionally stable families. Our donor base may be insulated from economic shocks but not all orgs or individuals will be in equally secure positions.
I think lower- to middle-performers and newer EAs may overestimate their job security and be overly optimistic about opportunities for future funding.
If that’s true, what should we be doing differently?
I've definitely thought about this. EA is a relatively young movement. Its momentum is massive at the moment, but even so, creating a career out of something like EA community building is far from certain, even for people who can reasonably easily secure funding for a few months or years.
I think that a good thing to do would be to ask "What would happen if EA ceased to exist in ten years?" when making career plans. If the answer is "Well, I would have been better off had I sought traditional career capital, but I think I'll land on my feet anyway" that's a fine answer - it would be unreasonable to expect that devoting years of your life to a niche movement has zero costs. If the answer is "I'd be completely screwed, I have no useful skills outside of this ecosystem and would still need to work for a living", I would be more concerned and suggest people alter plans accordingly.
That said, I think for many or most EAs, this will not be the case. Many EA cause areas require highly valuable skills such as software engineering, research ability, or operations/management skills that are useful in the private or public sector outside of effective altruism. I also think this concern mainly applies to very early-career individuals. For instance, I have a few years of SWE experience and want to move into AI safety. If EA disbanded in ten years...well, I'd still want to work on the problem, but what if we solved the alignment problem or proved it actually wasn't a major cause area somehow? And then EA said "Okay, thanks for all your hard work, but we don't really need AI alignment experts any more". I would be okay - I could go back to SWE work. I'd be worse off than if I'd spent ten years working for strong non-EA tech companies, but I would hardly be destitute.
It's not that hard to have a backup plan in place, but we should encourage people to have one. This may also help with mental health - leaving a line of retreat from EA should it be too overwhelming for some people is useful, and you don't have much of a line of retreat if you're dependent on EA for income.
I agree that for a lot of people, this won’t be a problem. A lot of EA roles are professionalizing, so people can switch over to traditional careers if they want. (As in, community building is enough like management, event planning, or outreach roles at a lot of traditional orgs that the skills may transfer).
One piece of good advice for most people:
Issue-specific expertise and professional networks don't transfer well. So a good backup plan should include spending time networking with EA-adjacent and non-EA orgs.
That issue seems inconvenient, but can be overcome with time and planning.
The main caution I want to raise is this:
it’s not always possible for EAs to leave themselves a psychological line of retreat into non-EA roles. Anecdote below to illustrate.
Suppose someone is currently reasonably happy, productive, and has a support system in their non-EA role and social scene. They’re not sure they’ll make it in EA.
Before switching to EA work, it might be worth considering this risk: suppose your EA role doesn't work out, you can't get another EA role, and this leaves you miserable and ineffective in almost all subsequent non-EA roles. Is that worth it?
If you're miserable at work for 1-5+ years and have a hard time relating to friends, family, and both EA and non-EA peers during that time, do you have reason to believe your mental health and finances are solid enough to recover? Or do you have mental health risks or other risk factors that make that a pretty dangerous bet?
I didn’t know I was taking that bet. It caught me very off guard. It seems so costly to have played this game that it would have been better to not work on EA projects, and to keep my EA participation more casual instead.
Solution?
I don’t think we have one yet. I don’t know where in the funnel I could have best been diverted, or how I could have best been supported when I tried to transition back to non-EA work.
I imagine EAs will get better at this over time.
Personal Anecdote:
Pre-EA, I didn't enjoy work if it had a low likelihood of a positive outcome/impact. That motivated me to find the most useful things I could do in whichever role I was in. I enjoyed that and was effective at it. It also led me to EA.
After a deep dive into EA projects and social scenes, my definition of impact changed. I was always aware that many “good things” might not actually do good. But the set of things that I saw as plausibly high impact got much smaller. The bar for “having an impact” got much higher. I endorse that.
After a few EA projects, I found I didn't have the right aptitudes for most EA work. This was humbling but mostly fine. I figured I'd just get a non-EA role like the ones I'd had before, and go back to making them a little bit more effective.
When I tried to take this line of retreat, though, I found I couldn't. I hadn't fully appreciated how hard I would hit a wall when trying to do work that no longer met my bar for impact. All the professional roles I'd previously been reasonably happy and effective in were now below the bar. It seems my brain doesn't just prefer impact; I found myself fundamentally incapable of working on something full-time when my brain couldn't see the case for impact.
This seemed stupid; surely I should just be able to muscle through and do a traditional job while I skill up in something else and try to move on to something that’s above my impact bar, right?
I could not. I tried, but depression and anxiety set in fast without the connection to impact, which decreased performance, which increased depression and anxiety. (Fun spiral, mate /s).
I’d need to quit and try again elsewhere. Same story repeated. (This didn’t happen pre-EA).
The instability means earning to give (EtG) isn't really available as a path either, and I'm not building a strong resume. I became a more frustrated and frustrating person, which decreased the quality of my work and my relationships. This means I no longer have the right mindset to use my aptitudes in non-EA roles either! I'm not sure this is changeable. A lot of the negative impacts are irreversible at this point.
I know at least one other EA in a similar boat. Maybe there are more, or maybe I’m a rare kind of bycatch. I’m not sure if I expect more or fewer cases like mine as EA grows.
Note: I'm not proud of, nor endorsing, my mindset. I feel a bit stupid for feeling this way and for sharing it. I've read Julia Wise's pieces about how it's okay for people who leave EA to be okay, and how it's fine to have more than one goal. I agree, but not yet at a deep enough level to alter my experience at work.