The recent pivot by 80,000 Hours to focus on AI seems potentially justified, but the lack of transparency and consultation makes me wary.
https://forum.effectivealtruism.org/posts/4ZE3pfwDKqRRNRggL/80-000-hours-is-shifting-its-strategic-approach-to-focus
TL;DR:
80,000 Hours, once a cause-agnostic, broad-scope introductory resource (career guides, career coaching, blogs, podcasts), has decided to focus on upskilling and producing content about AGI risk, AI alignment, and an AI-transformed world.
----------------------------------------
According to their post, they will still host their backlog of non-AGI content, but may not promote or feature it. They also say roughly 80% of new podcasts and other content will be AGI-focused, and that other cause areas, such as nuclear risk and biosecurity, may have to be covered by other organisations.
While I cannot claim in-depth knowledge of the norms around shifts like this, or of AI specifically, I will set aside the substantive case for the change and focus instead on the potential friction in how it was communicated.
To my knowledge (please correct me if I'm wrong), there was no public information or consultation beforehand, and I had no advance warning of this change. Organisations such as 80,000 Hours may not owe us this degree of openness, but since openness is a value heavily emphasised in EA, the silence feels slightly alienating.
Furthermore, the actual change may not be so dramatic, but it has left me grappling with the thought that other large organisations could pivot just as quickly. That isn't inherently bad, and it has the advantage of signalling being 'with the times' and 'putting our money where our mouth is' on cause-area risks. However, within an evidence-based framework, surely even some advance notice would go a long way towards reducing short-term confusion and gaps.
Many introductory programs and fellowships utilise 80k resources, and sometimes as embeds rather than as standalone resources. Despite claimi
A quick announcement: Magnify mentee applications are now open!
We would love to hear from you if you are a woman, non-binary person, or trans person of any gender who is enthusiastic about pursuing a high-impact career using evidence-based approaches. Please apply here by the 18th March.
Past mentees have been most successful when they have a clear sense of what they would like to achieve through the 6-month mentorship program. We look to match pairings based on the needs and availability of the mentee and mentor, their goals, career paths, and what skills they are looking to develop.
On average, mentees and mentors meet once a month for 60-90 minutes, with a series of optional prompt questions prepared by our team. In the post-round feedback form, the average rating for “I recommend being a Magnify mentee” was 9.28/10 in Round 3 and 9.4/10 in Round 4. You can see testimonials from some of our mentees here, here and here. Some reported outcomes for mentees were:
* Advice, guidance, and resources on achieving goals.
* Connection and support in pursuing opportunities (jobs, funding).
* Confidence-building.
* Specific guidance (How to network? How to write a good resume?).
* Joining a welcoming community for support through challenges.
If you have any questions, please do not hesitate to get in touch with Kathryn at <kathryn@magnifymentoring.org>.
I'm going to be leaving 80,000 Hours and joining Charity Entrepreneurship's incubator programme this summer!
The summer 2023 incubator round is focused on biosecurity and scalable global health charities and I'm really excited to see what's the best fit for me and hopefully launch a new charity. The ideas that the research team have written up look really exciting and I'm trepidatious about the challenge of being a founder but psyched for getting started. Watch this space! <3
I've been at 80,000 Hours for the last 3 years. I'm very proud of the 800+ advising calls I did and feel very privileged I got to talk to so many people and try and help them along their careers!
I've learned so much during my time at 80k. And the team at 80k has been wonderful to work with - so thoughtful, committed to working out what is the right thing to do, kind, and fun - I'll for sure be sad to leave them.
There are a few main reasons why I'm leaving now:
1. New career challenge - I want to try out something that stretches my skills beyond what I've done before. I think I could be a good fit for being a founder and running something big and complicated and valuable that wouldn't exist without me - I'd like to give it a try sooner rather than later.
2. Post-EA crises stepping away from EA community building a bit - Events over the last few months in EA made me re-evaluate how valuable I think the EA community and EA community building are as well as re-evaluate my personal relationship with EA. I haven't gone to the last few EAGs and switched my work away from doing advising calls for the last few months, while processing all this. I have been somewhat sad that there hasn't been more discussion and changes by now though I have been glad to see more EA leaders share things more recently (e.g. this from Ben Todd). I do still believe there are some really important ideas that EA prioritises but I'm more circumspect about some of the things I think we're not doing as well as we could (
I'm currently facing a career choice between a role working on AI safety directly and a role at 80,000 Hours. I don't want to go into the details too much publicly, but one really key component is how to think about the basic leverage argument in favour of 80k. This is the claim that goes: well, I in fact heard about the AIS job through 80k. If by working at 80k I ensure that even two additional people hear about AIS jobs, couldn't working at 80k be even better for AIS than doing the job myself?
In that form, the argument is naive and implausible. But I don't think I know what the "sophisticated" argument that replaces it is. Here are some thoughts:
* Working in AIS also promotes growth of AIS. It would be a mistake to only consider the second-order effects of a job when you're forced to by the lack of first-order effects.
* OK, but focusing on org growth full-time surely seems better for org growth than having it be a side effect of the main thing you're doing.
* One way to think about this is to compare two strategies of improving talent at a target org, between "try to find people to move them into roles in the org, as part of cultivating a whole overall talent pipeline into the org and related orgs", and "put all of your fulltime effort into having a single person, i.e. you, do a job at the org". It seems pretty easy to imagine that the former would be a better strategy?
* I think this is the same intuition that makes pyramid schemes seem appealing (something like: "surely I can recruit at least 2 people into the scheme, and surely they can recruit more people, and surely the norm is actually that you recruit a tonne of people"). It's only by looking at the mathematics of the population as a whole that you can see it can't possibly work, and that in fact most people in the scheme must recruit exactly zero people ever.
* Maybe a pyramid scheme is the extreme of "what if literally everyone in EA work
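The population arithmetic behind that pyramid-scheme point can be made concrete with a tiny sketch (not from the original post; the branching numbers are invented purely for illustration):

```python
# In any closed recruitment tree, every member except the founder was
# recruited by exactly one person, so total recruitments = N - 1 and the
# mean number of recruits per member is (N - 1) / N, which is always < 1.
def mean_recruits(population_size: int) -> float:
    """Average recruits per member in a closed recruitment tree."""
    return (population_size - 1) / population_size

# A skewed illustrative example: one founder recruits 10 people, each of
# those recruits 10 more, and the resulting 100 recruit nobody.
recruits_per_member = [10] + [10] * 10 + [0] * 100  # 111 members total
assert sum(recruits_per_member) == 110  # = N - 1, as it must be

zero_recruiters = sum(1 for r in recruits_per_member if r == 0)
print(zero_recruiters / len(recruits_per_member))  # ~0.90: most recruit no one
```

However the branching is arranged, the mean stays below one recruit per member, so the more top-heavy the early recruiters are, the larger the fraction of members who necessarily recruit nobody at all.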
GET AMBITIOUS SLOWLY
Most approaches to increasing agency and ambition focus on telling people to dream big and not be intimidated by large projects. I'm sure that works for some people, but it feels really flat for me, and I consider myself one of the lucky ones. The worst case scenario is big inspiring speeches get you really pumped up to Solve Big Problems but you lack the tools to meaningfully follow up.
Faced with big dreams but unclear ability to enact them, people have a few options.
* try anyway and fail badly, probably too badly for it to even be an educational failure.
* fake it, probably without knowing they're doing so
* learned helplessness, possible systemic depression
* be heading towards failure, but because too many people are counting on you, someone steps in and rescues you. They consider this net negative and would prefer the world where you'd never started to the one where they had to rescue you.
* discover more skills than they knew. feel great, accomplish great things, learn a lot.
The first three are all very costly, especially if you repeat the cycle a few times.
My preferred version is ambition snowball or "get ambitious slowly". Pick something big enough to feel challenging but not much more, accomplish it, and then use the skills and confidence you learn to tackle a marginally bigger challenge. This takes longer than immediately going for the brass ring and succeeding on the first try, but I claim it is ultimately faster and has higher EV than repeated failures.
I claim EA's emphasis on doing The Most Important Thing pushed people into premature ambition, and everyone is poorer for it. I certainly would have been better off hearing this 10 years ago.
What size of challenge is the right size? I've thought about this a lot and don't have a great answer. You can see how things feel in your gut, or compare to past projects. My few rules:
* stick to problems where failure will at least be informative. If you can't track reality well eno
USA has ~85k annual mowing injury ER visits:
https://pubmed.ncbi.nlm.nih.gov/29395756/
~44% of which are fractures and amputations:
https://pubmed.ncbi.nlm.nih.gov/30067452/
Lawn care is also ~5% of US air pollution:
https://www.bloomberg.com/opinion/articles/2021-05-21/lawn-mowers-are-the-next-electric-frontier
https://www.epa.gov/sites/default/files/2015-09/documents/banks.pdf
Autonomous mowing robots eliminate most of mowing's danger, pollution, labor cost/time, and noise.
EA hiring gets a lot of criticism. But I think there are some things it does unusually well.
One thing I like is that hiring and holding jobs feels much more collaborative between boss and employee. I'm much more likely to feel like a hiring manager wants to give me honest information and help me make the best decision, whether or not that's with them. Relative to the rest of the world, they're much less likely to take my investigating other options personally.
Work trials and even trial tasks have a high time cost, and are disruptive to people with normal amounts of free time and work constraints (e.g. not having a boss who wants you to trial with other orgs because they personally care about you doing the best thing, whether or not it's with them). But trials are so much more informative than interviews, I can't imagine hiring for or accepting a long-term job without one.
Trials are most useful when you have the least information about someone, so I expect removing them to lead to more inner-ring dynamics and less hiring of unconnected people.
EA also has an admirable norm of paying for trials, which no one does for interviews.
I often return to this bit of 80,000 Hours' anonymous career advice about how, when you're great at your job, no one's advice is that useful.
I like it a lot. It reminds me of Agnes Callard's observation about a young writer asking Margaret Atwood for advice and getting only the trite advice to "write every day":