I recently looked through the current version of the virtual groups intro syllabus and was disappointed to see zero mention of s-risks within the sections on longtermism/existential risk. I think this is a symptom of a larger problem, where “longtermism” has come to mean a very particular set of future-oriented projects (primarily extinction risk reduction) that derive primarily from one particular value system (classical utilitarianism). As facilitators responsible for introducing people to the ideas of EA, I think it’s important for us to diversify our readings and discussions to account for multiple reasonable starting positions. For a start, I suggest that we rework the week on existential risk to have a more general focus on cause prioritization in longtermism, including readings and discussions on the topic of s-risks.
More generally, I think that we should take the threat of groupthink very seriously. The best-funded and most influential parts of the EA community have come to prioritize a particular worldview and value system that is not necessarily definitive of EA, and one that reasonable people in the community could disagree with. Throughout my experience as a student organizer, I've seen many of my peers just defer to the views and values supported by organizations like 80,000 Hours without reflecting much on their own positions, which strikes me as quite problematic given that many want to represent EA as a question, not an ideology. Failing to include a broader range of ideas and topics in introductory fellowships only exacerbates this problem of groupthink.
I’d love to talk more about how we can diversify the range of views represented to newcomers, and in particular how we can “diversify longtermism.”
Reading your suggested required readings, S-risks: An introduction and S-risk FAQ, I don't get a clear sense of why s-risks are plausible or why the suggested interventions are useful. I like S-risks: Why they are the worst existential risks, and how to prevent them (EAG Boston 2017) a bit more for illustrating why they are plausible, and I've added it as an optional reading in the uni chapter intro program I'm running. Unfortunately, it gives only a cursory overview of how s-risks could be reduced. I'd be hesitant about making an s-risk reading required, though, since I don't know of high-quality intro-level material on s-risks that participants could use to learn more. I also expect that s-risks might provoke a lot of discussion, so we would want to make sure discussion facilitators have the resources to be well-informed about the issues. Right now, I wouldn't feel confident discussing s-risks with intro program participants.
By the way, if you want to make your suggestions as easy as possible to add to the curriculum, you should also suggest discussion questions that facilitators can ask about the topic.
(Note: I'm not in charge of the EA Virtual Programs syllabus.)