EA thinking is thinking on the margin. When EAs prioritise causes, they are prioritising given that they control only their own career, or, sometimes, given that they have some influence over a community of a few thousand people and the distribution of some millions or billions of dollars.
Some critiques of EA act as if statements about cause prioritisation were absolute rather than relative: as if EAs were saying that literally everyone should be working on AI Safety, or, on the flipside, that no one should be working on [insert a problem which is pressing, but not among the most urgent to commit the next million dollars to].
In conversations like this, I've often turned to the idea that if EAs controlled all the resources in the world, career advisors at the hypothetical world government's version of 80,000 Hours would be advising some people to be... postal workers. Since that world government would long ago have filled the current areas of direct EA work, postal work could be the single most impactful thing a person could do with their skillset, given how comparatively neglected it would be.
In this world, some people would also be told that the best thing they could do is work on [insert a problem which is pressing, but not among the most urgent to commit the next million dollars to in our current world].
It's basically just a fun thought experiment to make the point that EAs are not directing the whole world's resources, and that if they were, they wouldn't (and shouldn't) argue for neglecting everything except the current top EA causes.

I like the main point you're making.
However, I think "the government's version of 80,000 Hours" is a very command-economy vision. Command economies have a terrible track record, and if there were such a thing as an "EA world government" (which I would have many questions about regardless), I strongly think it shouldn't try to plan and direct everyone's individual careers, and should instead leverage market forces like ~all successful large economies.
Lol yep that's fair. Surprisingly, the conversation has never gone in this direction after I've shared the thought experiment.
Maybe it should be more like: in a world where resources are allocated according to EA priorities (allocation method left unspecified), 80,000 Hours would be likelier to tell someone to be a postal worker than an AI safety researcher... Bit less catchy though.
That's a catchy tagline, I might have to start using it :) Thanks!
I like this. It makes me think of how people working in typical EA jobs wouldn't be doing much at all without the people working in food, water, health, transportation, and the other things that make life livable and work workable; and if the number of those support workers suddenly fell to the number of direct workers, EA recruiters would focus on nothing but restoring the numbers of farmers, bus drivers, and healthy-world doctors. 💌