Liam Robins

Consider adopting the term o-risk.

William MacAskill has recently been writing a lot about how, if you're a longtermist, it's not enough merely to avoid catastrophic outcomes. Even if we secure a decent long-term future, it may still fall far short of the best future we could have achieved. That outcome, a merely okay future when we could have had a great one, would still be quite tragic.

Which got me thinking: EAs already have terms like x-risk (existential risk: roughly, the risk of human extinction or some similarly irrecoverable outcome) and s-risk (suffering risk: the risk of widespread future suffering among conscious beings). But as far as I'm aware, there isn't a term for the specific kind of risk MacAskill is worried about here. So I propose adding the term o-risk: a risk that the long-term future turns out okay, but much worse than it optimally could have been.