
I'm thinking of creating a group for EAs applying for jobs. If there already is one, please say where it is.

The vision would be a place where people can encourage one another when they get rejections, point to resources, read each other's applications, etc.

So imagine such a group existed. Where would you expect to find it? Then I'll set one up there. 

6 Answers

An independent Discord

This is an invite that won't expire: https://discord.gg/BEFBqX6Qap

I made a Discord. Anyone can join here: https://discord.gg/aYS7Hb2s

maximumpeaches: That invite link no longer works. Can you share steps on how to join? Thanks.

Nathan Young: Sorry. https://discord.gg/BEFBqX6Qap

I like that kind of platform, but Slack seems more widely used by EAs than Discord. If this gets the most votes, I'd consider checking whether people actually prefer Discord or would rather use Slack.

Nathan Young: I think Slack has higher barriers to entry. I'm going to try a Discord, and if it gets no take-up I can try something else.

Maybe the existing EA Career Discussion Facebook group would be a good place to start such peer-support groups, and then they could create their own Facebook groups / Slack / etc.?

Nathan Young: I think anyone is welcome to create their own Facebook equivalent, but I am going to make a Discord.

Slack (EA Anywhere, EA Global, or a new Slack)

I think you have a better chance of success if you use existing suitable platforms instead of setting up new ones.

EA Anywhere seems the best fit for that, so I'd recommend checking with Marisa (the EA Anywhere organiser) whether she thinks their Slack would be a good place for this.

Nathan Young: Sure, but you only have a better chance among the people who are already on the Slack. EA Anywhere has 176 members. That doesn't seem like much additional benefit.

As part of another EA Discord

Did you end up creating the Discord server? I tried to follow the invite link you posted here, but it didn't work. I would like to join if possible.

An open WhatsApp group pointed to by a forum post

WhatsApp does not have a thread function, so I'd expect this to get chaotic rather quickly.

Comments (3)

I think this is a good idea, especially if the person managing the group is really good at community building and encouragement. I think this works better as a Slack or Discord, so that multiple channels can be created for different groups, such as those based on causes or career-path interests, or for forming cohorts or accountability partnerships between people. I think the ideal would be groups of ~3-6 people encouraging each other and being transparent about their EA job applications. Those groups could then transition to Facebook or WhatsApp groups if they prefer that.

They could probably have one social call to meet each other and say where they plan on applying, then catch up weekly via an update, and have a call every month or two.

I think it's worthwhile, and lots of local groups do that already (like EA Berlin). 

Maybe local groups can do most of that, since they have the benefit of people being able to meet in person (after corona) and sharing a similar culture, job prospects, etc., and then everyone who doesn't have a local group nearby could use EA Anywhere?

Would also recommend asking on the EA Groups Slack. Are you on there yet? The invite link should be somewhere on www.eahub.org, but let me know if you can't find it. 
