The discussion of AI risk has only recently become mainstream, so amateurs were able to make real contributions within the past decade. I think this experience inflates the self-assuredness of nuke risk amateurs and leads them not to bother researching the expertise of the nuke community.
For decades, many experts have worked on nuke strategy, and have come up with at least a few risk-reducing paradigms:
- Arms control can work: nations can achieve their achievable nuke goals (e.g. deterrence and maybe compellence) despite lower nuke counts, and can save money doing so.
- Counterforce is (arguably) better than countervalue.
- Escalation is arguably a ladder, not a binary on/off switch.
Based on its history of at least partial success in reducing risk, its academically rigorous arguments, and the sheer number of thoughtful hours spent, the establishment nuke community has probably done a decent job, and improvements are probably hard to find. One place to start is Fred Kaplan's book The Wizards of Armageddon. It isn't the best book on nuke strategy generally, but it focuses on the history of the nuke community, so it will hopefully engender at least some respect for that community and inspire further reading.
I think that in a field as well-established as nuke risk, improvements are more likely to be made on top of the existing field rather than by trying to re-invent the field.
Post-script: In a recent podcast, a nuke professional criticizes the EA community as unhelpfully amateurish (https://www.armscontrolwonk.com/archive/1216048/the-wizards-of-armageddon/), but he does mention some positive work by Peter Scoblic, which I believe is https://forum.effectivealtruism.org/posts/W8dpCJGkwrwn7BfLk/nuclear-expert-comment-on-samotsvety-nuclear-risk-forecast-2
Well, this isn’t how I wanted to start my engagement with the EA community.
I wouldn’t call the efforts of the EA community amateurish; if I said or implied that, I was wrong. I am actually really happy you exist.
Other things I actually think:
We need to do better, both by providing better data and by providing the data that you need, but I am slightly freaked out about the size of the gap we need to close. I want to close that gap, and I am kind of bummed if the way I said that in the podcast makes that less likely.
TL;DR: I don’t think you suck, I think you are poorly served by those of us who make your data.
Thanks Jeffrey! I hope we're a community where it doesn't matter so much whether you think we suck. If you think the EA community should engage more with nuclear security issues and should do so in different ways, I'm sure people would love to hear it. I would! Especially if you'd help answer questions like: How much can work on nuclear security reduce existential risk? What kind of nuclear security work is most important from an x-risk perspective?
I'd love to hear more about what your concerns and criticisms are. For example, I'd love to know: Is the Scob...