Peter Thiel & Eric Weinstein discuss global catastrophic risks, including biosecurity and AI alignment, starting at around the 2:33:00 mark of Thiel's interview on Weinstein's new podcast.
tl;dl – Thiel thinks GCRs are a concern, but is also very worried about political violence / violence perpetrated by strong states. He thinks catastrophic political violence is much more likely than GCRs like AI misalignment.
He has a story about political violence becoming more likely when there's no economic growth, and so is worried about the present stagnation. (Not 100% sure I'm representing that correctly.)
Soon after the GCR discussion, there's also an interesting bit about transparency and how it often becomes weaponized when put into practice.
Robin Hanson's latest (a) is related.
Given the stakes, it's a bit surprising that "has the risk of war secularly declined, or are we just in a local minimum?" hasn't received more attention from EA.
Holden looked at this (a) a few years ago and concluded:
If I recall correctly, Pinker also spent some time noting that violence appears to have been moving toward more of a power-law distribution since the early 20th century (fewer episodes, but the magnitude of each episode is much more severe).
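To make that concrete, here's a minimal sketch (my own illustration, not Pinker's analysis; the tail exponent is an assumed value picked for demonstration) of why a power-law tail is worrying: a handful of the largest events can account for a large share of total severity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: draw 1,000 "conflict magnitudes" from a
# classic Pareto (power-law) distribution. Heavier tails (smaller alpha)
# mean the largest few events dominate the total.
alpha = 1.5  # assumed tail exponent, chosen only for illustration
magnitudes = 1 + rng.pareto(alpha, size=1000)  # Pareto with minimum 1

largest_10 = np.sort(magnitudes)[-10:]
share = largest_10.sum() / magnitudes.sum()
print(f"Top 10 of 1,000 events account for {share:.0%} of total magnitude")
```

With a tail that heavy, "fewer episodes" offers little comfort: the expected damage is driven almost entirely by the rare largest events.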
"War aversion" seems like a plausible x-risk reduction focus area in its own right (it sorta bridges AI risk, biosecurity, and nuclear security).
This chart really conveys the concern at a glance:
(source) (a)
... what if the curve swings upward again?