The institute is called the Käte Hamburger Centre for Apocalyptic and Post-Apocalyptic Studies and is based in Heidelberg, Germany. It opened in 2021 with initial funding of €9 million from the German government for its first four years.
AFAICT, they study the sociological aspects of narratives about apocalypses, existential risks, and the end of the world.
They have engaged with EA thinking, and I assume they will have an interesting outside perspective on some worldviews prevalent in EA. For example, here is a recorded talk about longtermism (I have only skimmed through it so far), which mentions MIRI, FHI, and What We Owe the Future.
I stumbled upon this today and thought it could interest some people here. Generally, I am very curious to learn more about worldviews alternative to EA that also engage with existential risk in epistemically sound ways. One criticism of EA that has become more popular over the last few months is that EA organizations engage too little with other disciplines and institutions that have relevant expertise. So I suggest checking out this Centre's work.
Please comment if you have engaged with them before and know more than I do.
I appreciate the pushback, but I also think the upvotes likely don't indicate that EAs reflexively think every criticism is great, no matter how useless/uncharitable/etc. it is. Just from skimming this post, I think it was reasonable for me to react with: "Nice, some more people interested in x-risks/dystopias, and studying them from a critical sociological perspective, whatever that means. [Looking into it for 2 minutes without spotting anything really interesting] Thanks for sharing." I'd guess that reaction is more representative of the upvotes.
E.g., one example of EAs responding to criticism in a calibrated way is, imo, this recent thread about a book that's very critical of EA and that seems to be receiving appropriate pushback for its flaws: https://forum.effectivealtruism.org/posts/YFGkyDjKvsr9tHzkS/book-post-the-good-it-promises-the-harm-it-does-critical
That said, I too am now and then surprised by the number of upvotes some EA criticism gets here despite what I perceive to be relatively low usefulness. I'm probably convinced, though, that this collective behavior is optimal: criticism is hard and feels socially abrasive and adversarial, so to encourage it we should bias towards upvoting, even if only to reward the energy somebody put into the general project of improving EA. EAs as a community should just also realize that a critical post having 400+ upvotes doesn't necessarily reflect agreement or quality. (I'm also a fan of the idea of introducing agreement votes for posts themselves.)