Note: Aaron Gertler, a Forum moderator, is posting this with Toby's account. (That's why the post is written in the third person.)
This is a Virtual EA Global AMA: several people will be posting AMAs on the Forum, then recording their answers in videos that will be broadcast at the Virtual EA Global event this weekend.
Please post your questions by 10:00 am PDT on March 18th (Wednesday) if you can. That's when Toby plans to record his video.
About Toby
Toby Ord is a moral philosopher focusing on the big-picture questions facing humanity. What are the most important issues of our time? How can we best address them?
His earlier work explored the ethics of global health and global poverty, which led him to create Giving What We Can, whose members have pledged hundreds of millions of pounds to the most effective charities helping to improve the world. He also co-founded the wider effective altruism movement.
His current research is on avoiding the threat of human extinction, which he considers to be among the most pressing and neglected issues we face. He has advised the World Health Organization, the World Bank, the World Economic Forum, the US National Intelligence Council, the UK Prime Minister’s Office, Cabinet Office, and Government Office for Science. His work has been featured more than a hundred times in the national and international media.
Toby's new book, The Precipice, is now available for purchase in the UK and for pre-order in other countries. You can learn more about the book here.
In your book, you define an existential catastrophe as "the destruction of humanity's longterm potential". Would defining it instead as "the destruction of the vast majority of the longterm potential for value in the universe" capture the concept you wish to refer to? Would that perhaps capture it slightly more accurately and explicitly, though in a less accessible or emotionally resonant way?
I wonder this partly because you write:
It also seems to me that "the destruction of the vast majority of the longterm potential for value in the universe" is meaningfully closer to what I'm really interested in avoiding than the destruction of humanity's potential, especially if/when AGI, aliens, or other intelligent life evolving on Earth becomes, or is predicted to become, an important shaper of events (either now or in the distant future).