
As part of 'strategy fortnight' (and in part inspired by this post), I decided to write this short post clarifying the relationship, as I see it,[1] between 80,000 Hours and the EA community. I chose these questions because I thought some people might care about the answers and want to know what (someone on the leadership team at) 80,000 Hours would say.

Is 80,000 Hours's mission to build the EA community?

No — our aim is to help people have careers with a lot of social impact. If the EA community didn't exist, we could still pursue our mission.

However, we count ourselves as part of the EA community in part because we think it's pretty great. It has flaws, and we don't make a blanket recommendation to our readers to get involved (a recommendation we feel better about making widely is to get involved in some kind of community that shares your aims). But we think the EA community does do a lot to help people (including us) figure out how to have a big positive impact, think through options carefully, and work together to make projects happen.

For that reason, we do say we think learning about and getting involved in the EA community could be a great idea for many of our readers.

And we think building the EA community can be a way to have a high-impact career, so we list articles focused on it high up on our problem profiles page and among our career reviews.

Both of these are ways in which we do contribute substantially to building the effective altruism community.

We think this is one of the ways we've had a positive impact over the years, so we do continue to put energy into this route to value (more on this below). But doing so is ultimately about helping the individuals we are writing for increase their ability to have a positive impact by getting involved, rather than about benefiting the community per se.

In other words, helping grow the EA community is part of our strategy for pursuing our mission of helping people have high-impact careers.[2]

Does 80,000 Hours seek to provide "career advice for effective altruists"?

Somewhat, but not mostly, and it would feel misleading to put it that way.

80,000 Hours focuses on helping a group much larger than the (current) EA community have higher-impact careers. For example, we estimate the size of the group we are trying to reach with the website to be ~100k people — around 10x the size of the EA community. (For context, we currently get in the range of 4M visitors to the website a year, and have 300k newsletter subscribers.)

Some of the people in our audience are part of the EA community already, but they're a minority.

One reason we focus so broadly is that we are trying to optimise the marginal counterfactual impact of our efforts. This often translates into focusing on people who aren't already heavily involved and so don't have other EA resources to draw on. For someone who hasn't heard of EA, or who has heard of it but doesn't know much about it, there is much more low-hanging fruit for counterfactually improving the impact of their career. For example, we can introduce them to ideas that are well known within EA, like the ITN framework and cause selection, to particularly pressing issues like AI safety and biosecurity, and to the EA community itself. Once someone is involved in EA, they are also more likely and more able to take advantage of resources that are less optimised for newer people.

This is not an absolute rule, and it varies by programme – for example, the website tends to focus more (though not exclusively) on 'introductory' materials than the podcast, which aims to go more in-depth, and one-on-one advising tries to tailor its discussions to the needs of whoever it's talking to. This could also change with time.

We hope people in the EA community can benefit from some of our advice and programmes, and we welcome their engagement with and feedback on our ideas. But overall, we are not focused on career advice for members of the effective altruism community in particular.[3]

Does 80,000 Hours seek to reflect the views of the EA community?

No – we want our advice to be as true, and as useful for having a greater impact, as it can be, and we use what people in the EA community think as one source of evidence (among others) about that. As above, we consider the EA community to be a rich source of advice, thinking, research, and wisdom from people who largely share our values. For example, we tend to be pretty avid readers of the EA Forum and other EA resources, and we often seek advice from experts in our network, who often count themselves as part of the EA community.[4]

But to the extent that our best guesses diverge from what the EA community thinks on a topic, we go with our best guesses.

To be clear, these guesses are all-things-considered judgements, attempting to account for empirical and moral uncertainty and the views of subject-matter experts. But, for example, going with our best guesses leads us to prioritise working on existential risk reduction and generally have a more longtermist picture of cause prioritisation than we would have if we were instead trying to reflect the views of the community as a group.[5]

So what is the relationship between 80,000 Hours and the EA community?

We consider ourselves part of the EA community, in the same way other orgs and individuals who post to this forum are. We share the primary goal of doing good as effectively as we can, and we think it's helpful and important to use evidence and careful reasoning to guide us in doing that. In other words, we try to put these principles of effective altruism into practice.

The history of 80,000 Hours is also closely connected to the history of the broader EA movement. For example, 80,000 Hours helped popularise career choice as a priority for people in the EA community, and our founders, staff, and advisors have held a variety of roles in other EA-affiliated organisations.

As above, we also often draw on the research, wisdom, and resources that others in this community generously share – either through channels like the EA Forum, or privately. Likewise, we often recommend EA to our readers — as a community to learn from or help build — because we think it's useful and that its animating values are really important.

Moreover, we do think much of our impact is mediated by other members of the EA community. They are a huge source of ongoing support, connections, and dynamic information for helping our readers, listeners, and advisees continue their (80,000-hour-long!) journeys doing good with their careers beyond what we can provide. Introducing people to the community and helping them get more involved might be among the main ways we've had an impact over the years.

We are institutionally embedded. 80,000 Hours, like the Centre for Effective Altruism, Giving What We Can, and others, is a project of the Effective Ventures group, and our biggest funder is Open Philanthropy, through its Effective Altruism Community Growth (Longtermism) programme. Our other donors are also often involved in EA.

Overall, we regard the EA community as full of great collaborators and partners in our shared goal of doing good effectively. We are grateful to be part of the exchange of important ideas, research, infrastructure, feedback, learning experiences, and financial support from other community members.

 

  1. This post was reviewed by 80,000 Hours CEO Brenton Mayer and the other programme directors, Michelle Hutchinson and Niel Bowerman, as well as 80,000 Hours writer Cody Fenwick, before publishing.

  2. Related: 'cause first' and 'member first' approaches to community building — this post suggests that 80,000 Hours 'leans cause first', focusing on building the EA community as a way of getting more people contributing to solving (or prioritising, or otherwise building capacity to solve) pressing global problems. I don't agree with everything in that post with regard to the shape of the distinction (e.g. I think a 'cause first' approach, defined in the way I just did, should often centre principles and encourage people to think for themselves), but I agree with its basic classification of 80,000 Hours.

  3. This feels like a good place to note that we are supportive of other career-advice-focused organisations cropping up (like Probably Good and Successif), and it also seems great for individuals to post their takes on career advice (like Holden's career advice for longtermists and Richard Ngo's AI safety career advice). Not only does this produce content aimed more at audiences that 80,000 Hours' content doesn't currently focus on as much (like current members of the EA community), there is just generally room for lots of voices in this space (and if there isn't room, competition seems healthy).

  4. Also, like other members of the community, we are informally influenced in all kinds of ways by thinking in EA – we come across books and thinkers more often if they are recommended by community members; we are socially connected to people who share ideas; etc. These are ways in which we might be too influenced by what the EA community thinks — it's harder to find, and keep salient, ideas from outside the community, so we have to put more work into doing so.

  5. A lot of these questions are really hard, and we're very far from certain we have the right answers — others in or outside the EA community could be more right than we are on cause prioritisation or other questions. Also, a lot of our advice, like our career guide and career planning materials, is designed to be useful to people regardless of which issues are most pressing or which careers are highest impact.

Comments

We hope people in the EA community can benefit from some of our advice and programmes, and we welcome their engagement with and feedback on our ideas. But overall, we are not focused on career advice for members of the effective altruism community in particular.

 

This seems like it could mean different things:

  1. "The 80k advice is meant to be great for a broad audience, which includes, among others, EAs. If we'd focus specifically on EAs it would be even better, but EAs are our target audience like anyone else is", or
  2. "The 80k advice is targeted at non-EAs. EAs might get some above-zero value from it, or they might give useful comments, and we don't want to tell EAs not to read 80k, but we know it is often probably bad-fit advice for EAs. For example, we talk a lot about things EAs already know, and we only mention in brief things that EAs should consider in length."
    1. Or even, ".. and we push people in direction X while most EAs should probably be pushed in direction NOT-X. For example, most non-EAs should think about how they could be having more impact, but most EAs should stop worrying about that so much because it's breaking them and they're already having a huge impact"

Could you clarify what you mean?

This feels fairly tricky to me actually -- I think between the two options presented I'd go with (1) (except I'm not sure what you mean by "If we'd focus specifically on EAs it would be even better" -- I do overall endorse our current choice of not focusing specifically on EAs).

However, some aspects of (2) seem right too. For example, I do think that we talk about a lot of things EAs already know about in much of our content (though not all of it). And I think some of the "here's why it makes sense to focus on impact"-type content does fall into that category (though I don't think it's harmful for EAs to consume that, just not particularly useful).

The way I'd explain it:

Our audience does include EAs. But there are a lot of different sub-audiences within the audience. Some of our content won't be good for some of those sub-audiences. We also often prioritise the non-EA sub-audiences over the EA sub-audience when thinking about what to write. I'd say that the website currently does this the majority of the time, but sometimes we do the reverse.

We try to produce different content that is aimed primarily at different sub-audiences, but which we hope will still be accessible to the rest of the target audience. So for example, our career guide is mostly aimed at people who aren't currently EAs, but we want it to be at least somewhat useful for EAs. Conversely, some of our content -- like this post on whether or not to take capabilities-enhancing roles if you want to help with AI safety (https://80000hours.org/articles/ai-capabilities/), and to a lesser extent our career reviews -- is "further down our funnel" and so might be a better fit for EAs; but we also want it to be accessible to non-EAs and put work into making that the case.

This trickiness is a downside of having a broad target audience that includes different sub-audiences.

I guess if the question is "do I think EAs should ever read any of our content" I'd say yes. If the question is "do I think all of our content is a good fit for EAs" I'd say no. If the question is "do I think any of our content is harmful for EAs to read" I'd say "overall no" though there are some cases of people (EAs and non-EAs) being negatively affected by our content (e.g. finding it demoralising).

Thanks

I was specifically thinking about career guides (and I'm most interested in software, personally).

 

(I'm embarrassed to say I forgot 80k has lots of other material too, especially since I keep sharing that other material with my friends and referencing it as a trusted source. For example, you're my go-to source about climate. So, totally oops for forgetting all that, and +1 for writing it and having it be relevant for me too.)

Have the answers to these questions changed over the years? E.g. how might you have answered them in 2017 or 2015?

I don't know the answer to this, because I've only been working at 80k since 2019 - but my impression is that this isn't radically different from what might have been written in those years.
