Patrick Gruban 🔸

COO @ Successif
2479 karma · Joined · Working (15+ years) · 80538 München-Altstadt-Lehel, Germany

Bio

Participation
5

COO Successif, Trustee Effective Ventures UK, and member of the Talos Network board

Co-Director EA Germany 2023/24, entrepreneur for over 25 years, member of the EA Munich organizer team since 2020.

How others can help me

Let me know if you have ideas for Successif

How I can help others

I can offer to mentor and be a sounding board if you are an EA-aligned non-profit entrepreneur

Comments
137

Topic contributions
5

I guess I see so many EAs with alignment and general EA knowledge, but lacking these kinds of general professional skills, and I really want these people (who have devoted so much time and effort to EA) to have better professional competencies.

I very much agree with that. When people with no/little professional experience ask me about getting into impactful work outside research, my default advice is to upskill outside impact orgs for a few years and then see how they can apply this experience later. Sometimes I fear that organizations in our space contribute to the problem by hiring more on the basis of value alignment than professional skills, with hiring managers sometimes not even aware of what the strongest candidate for a role could look like, as they don't have experience with this.

This ultimately goes up to management, where I'm surprised to see so few org founders hiring experienced CEOs and stepping into roles they are better suited to (Chief Strategist, Chief Researcher, Chief Policymaker, etc.). When I started my first startup straight out of school, this is what we did, and it enabled us to grow the org to over 100 people quickly. At that time, I would have been out of my depth hiring the kind of middle management orgs need at that size.

That being said, at RAISEimpact, we help org leaders with hiring strategy and thinking about team composition and culture, so hopefully we can help in this way.

I keep thinking that we have plenty of people involved in EA who are onboard with the general ideas and who want to contribute, but who lack specific skills. Is this a good thing?

It seems there are two ways to solve this: upskill motivated people, or help skilled people stay motivated throughout the career transition. While there are plenty of resources outside impact areas to help people upskill and find more traditional jobs, an impactful talent org can better support the latter.

When people are surprised that you would need motivation to switch jobs without upskilling, I like to point them to Jim Chapman's post: as an experienced professional, he went through dozens of applications, rejections, and several programs. This is a pretty typical journey that we see at Successif, as we have written about here. Programs like HIP and the CEA Bootcamp also support this (as does the School for Moral Ambition).

A couple of years ago, almost all programs in this space were for people who still needed upskilling, and I think this led organizations to sometimes hire people who didn't have the necessary skills and who had to learn on the job. Having experienced professionals join seems better to me for most orgs.

Are there specific skills that you see regular training programs not covering well, which might be better offered specifically for our ecosystem?

@Yonatan Cale posted a demo last week of an app he’s building in the EAG Bay Area Slack.

I agree and argued in a similar direction in a comment last year.

As an employer, I would not want to rely on an employee taking a below-market salary. Otherwise, I might be incentivised to keep someone on the team even if they are underperforming, which undermines the work of other team members. I would want to hire for talent first and leave salary discussions for the last step, to avoid bias.

That being said, there might be good cases for early-stage orgs that might otherwise not be able to hire, or for positions that might open because of the low salary requirements. At Successif, we recommend that job-seekers use informational interviews with potential employers to explore these kinds of new roles. 

My short Claude prompt was only intended as a conversation starter, so I'm happy this worked. I'm not considering investing, but if potential investors would like to carry this on and share here, this might be useful.

I agree, and I would encourage potential investors to take into consideration the base rates of startups reaching €1M+ in yearly profits when comparing this to other forms of investment. I spent 5 minutes prompting Claude to come up with a BOTEC based on this post, which I haven't checked but which could be an entry point for additional research.

The board members of EA Germany are elected by the members (over 100) of the organization. The board is responsible for hiring the director.

What you write aligns with the challenges we see from our advisees, and based on your profile, you may be a good fit for our career advising program if you are open to working on AI Risk reduction.

For people we have helped in our program, we typically see transition timelines of 6-18 months, but just today I talked with someone for whom it took two years. My colleague Moneer wrote about his experience of getting into a position, which included taking 170 actions (applications, 1-1s, projects, and the like). This can seem like a lot, but it comes down to about 3 per week on average over a year.

You don't have to have an academic background to succeed (I never went to university myself), but regardless of your qualifications, be prepared for it to take time to find a position. In our advising, we emphasize the importance of building networks, conducting informational interviews, and getting more information on how to position oneself as a promising candidate.

That being said, you might be able to have a higher counterfactual impact if you find a position that is not in one of the well-known orgs - I would keep that in mind.

Talent pipelines. There’s no fellowship to train people to go directly into advocacy for AIS, compared to over 10 such efforts aimed at research. We’re training AIS researchers by the 100s, and leaving advocates to figure it out for themselves.

I'm not sure if this is different from what you meant, but we ran the first iteration of our AI Safety Advocacy Fellowship this year with promising results.
