It seems that the distribution of wealth (and other forms of power) following transformative AI is highly consequential for the value of the long-term future, affecting both immediate suffering and well-being and the likelihood of a long-term value lock-in. I'm aware of the Windfall Clause report and the three papers and several posts cited here. What other work should I be aware of?
Sam Altman has written up some of his related views in Moore's Law for Everything.
He is also involved in Worldcoin, which aims to bootstrap a global currency by giving a share to every person.