Manuel Allgaier

850 karma · Joined · 10365 Berlin, Germany

Bio

Former director of EA Germany and EAGxBerlin 2022 event lead, currently on a career break to explore some longtermist & AI Safety ideas as well as work on personal (non-public) projects


Bio: I worked full-time in EA movement building (funded by CEA) as Director of EA Berlin (2019-21), Director of EA Germany (2021-22) and EAGxBerlin Event Lead (2022). Previously, I worked in sustainability consulting and charity management, studied environmental science, economics & IT, and have volunteered, lived and worked in Phnom Penh (Cambodia), Amsterdam and Berlin.

https://www.linkedin.com/in/manuelallgaier/

How others can help me

If you have any ideas for EA Berlin or would like to get involved, I'd be happy to hear from you! Just message me here on the forum or on Linkedin.

Feedback on me and my work is always welcome: bit.ly/ea_anonymous_feedback

Comments (119)

Cool that you're doing this! I could share two failures, one in my career plans and one in job applications. I could do that in 1-4 minutes, depending on how many other people want to share and how much time we have. Looking forward! :)

Nice map! Do you want to upload this to a website so people can share and find it more easily (similar to aisafety.world)? Could be worth investing a tiny bit of money to buy such a domain?

One additional reason:

If you get your (initial) training from a neutral-ish-impact organisation, like some management consulting or tech companies, and then move on to a high-impact job, you can add value right away with lower 'training costs' for the high-impact org = more impact.

All else equal, an EA org whose staff have 1-3 years of (non-EA) job experience can achieve more impact more quickly than one with partly inexperienced staff.

That said, some things such as good epistemics or high moral integrity may be easier to learn at EA orgs (though they can definitely also be learned elsewhere).

I've supported >100 people in their career plans, and this seems pretty solid but underappreciated advice. Thanks for writing it up!

I think I made that mistake too. I went for EA jobs early in my career (running EA Berlin and then EA Germany, 2019-22, funded by CEA grants). There were some good reasons: this work seemed particularly neglected in 2019-21, it seemed a good fit, and three senior people I had in-depth career 1-1s with all recommended it. I learned a lot, met many inspiring people, and I think I did have some significant positive impact as well, on the community overall (it grew and professionalized) and on some individual members' careers.

However, I made a lot of mistakes too and had slow feedback loops (no manager, little mentorship), and I'm pretty sure I would have learned many (soft) skills faster and built better career capital overall (both in- and outside of EA) if I had first spent 1-2 years in management consulting or in a fast-growing (non-EA) tech company with good management, and then gone on to direct EA work.

I agree that it would be good to have citations. In case neither Ozzie nor anyone else here finds it a good use of their time to do it: I've been following OpenAI's and Sam Altman's messaging specifically for a while, and Ozzie's summary of their (conflicting) messaging seems roughly accurate to me. It's easy to notice the inconsistencies in Sam Altman's messaging, especially when it comes to safety.

Another commenter (whose name I forgot, I think he was from CLTR) put it nicely: it feels like Altman does not have one consistent set of beliefs (like an ethics/safety researcher would) but tends to say different things that are useful for achieving his goals (like many CEOs do), and he seems to do that more than other AI lab executives at Anthropic or DeepMind.

This could be a community effort. If you're reading this and have a spare minute, can you recall any sources for any of Ozzie's claims and share links to them here? (Or go the extra mile: copy his post into a Google Doc and add sources there?)

It's easy to vote for something you don't have to pay for. If we do anything like this, an additional fundraiser to pay for it might be appropriate.

Earning to Give still seems the best way to contribute for many people (e.g. people with exceptionally high earning potential, or people with decently paying jobs who aren't a good fit for direct work or don't want to switch jobs). I don't think we should distance ourselves from it. 

While I'm also interested in the finances, I fully understand if they prefer not to share all this info publicly. Afaik it's not common to share such detailed financial statements publicly, even for non-profits.

+1, I'd find this very useful too! 

For context: after working full-time in EA meta for >3 years, I've been thinking about renting or buying property in/near Berlin, or in a cheaper place in Europe, to facilitate EA/longtermist events, co-working and maybe also co-living. I know many others are thinking about this too, some of whom are already making plans, and such retrospectives would be really helpful to inform our decisions. If you prefer not to share it publicly, you can also email me.

From the limited info I have, Wytham Abbey seemed a good idea at the time, and I appreciate you going for it! The decision to sell was probably pretty hard to make; I hope all involved feel good about it now.
