We've had a lot of votes on the banner! If you'd like to explain why you voted the way you did, what your cruxes[1] are, and what would change your mind, comment in this thread.
You can also mention if you'd be open to having a dialogue with another Forum user who disagrees with you. If someone comments and offers to dialogue with you, you can set up a time to write a dialogue together (perhaps via Forum DMs).
To find out more about the event, and how to contribute, read the announcement post.
- ^
Beliefs or assumptions which determine your overall opinion, but which are better targets for argument, or on which you would more easily change your mind. For example, one of mine is "philosophy of mind doesn't make progress".
To my mind, the first point applies to whatever resources are used throughout the future, whether it’s just the earth or some larger part of the universe.
I agree that the number/importance of welfare subjects in the future is a crucial consideration for how much longtermist work to do as opposed to other work. But when comparing longtermist interventions with each other—say, splitting a budget between lowering the risk of the world ending and proportionally increasing the fraction of resources devoted to creating happy artificial minds—it would seem to me that the "size of the future" typically multiplies the value of both interventions equally, and so doesn't matter.