I have not researched longtermism deeply. However, what I have found out so far leaves me puzzled and skeptical. As I currently see it, you can divide what longtermism cares about into two categories:
1) Existential risk.
2) Common sense long-term priorities, such as:
- economic growth
- environmentalism
- scientific and technological progress
- social and moral progress
Existential risk isn’t a new idea (relative to longtermism) and economic growth, environmentalism, and societal progress aren’t new ideas either. Suppose I already care a lot about low-probability existential catastrophes and I already buy into common sense ideas about sustainability, growth, and progress. Does longtermism have anything new to tell me?
Longtermism suggests a different focus within existential risks, because it treats "99% of humanity is destroyed, but the remaining 1% are able to rebuild civilisation" very differently from "100% of humanity is destroyed and civilisation ends", even though from the perspective of people alive today these outcomes are very similar.
I think that, relative to neartermist intuitions about catastrophic risk, the particular focus on extinction raises the priority of AI and engineered biorisks relative to e.g. climate change and natural pandemics. Basically, total extinction is quite a high bar, most easily reached by threats that deliberately aim for it; natural disasters don't counter-adapt when some people survive.
Longtermism also supports research into civilisational resilience measures, like bunkers, or research into how or whether civilisation could survive and rebuild after a catastrophe.
Longtermism also lowers the probability bar that an extinction risk has to reach before being worth taking seriously. I think this used to be a bigger part of the reason why people worked on x-risk when typical risk estimates were lower; over time, as risk estimates increased, longtermism became less necessary to justify working on them.
Sorry for replying to this ancient post now. (I was looking at my old EA Forum posts after not being active on the forum for about a year.)
Here's why this answer feels unsatisfying to me. An incredibly mainstream view is to care about everyone alive today and everyone who will be born in the next 100 years. I have to imagine over 90% of people in the world would agree to that view or a view very close to that if you asked them.
That's already a reason to care about existential risks and a reason people do care about what they perceive as existential risks o…