Edit: To clarify, when I say "accept Pascal's Wager" I mean accepting the idea that the way to do the most (expected) good is to prevent as many people as possible from going to hell, and cause as many as possible to go to heaven, regardless of how likely it is that heaven/hell exists (as long as the probability is non-zero).
I am a utilitarian and I struggle to see why I shouldn't accept Pascal's Wager. I'm honestly surprised there isn't much discussion about it in this community considering it theoretically presents the most effective way to be altruistic.
I have heard the argument that there could be a god that reverses the positions of heaven and hell, so that the probabilities cancel out, but this doesn't convince me. A god matching the god of existing religions seems far more likely than a god that is its opposite, so the probabilities don't cancel: the expected utilities aren't equal.
I've also heard the argument that we should reject all infinite utilities. For now, Pascal's Wager seems to me to be the only case where the probabilities don't cancel out, so I don't run into any paradoxes or inconsistencies, but this is probably quite a fragile position that could change. I also don't know how I would go about rejecting infinite utilities if it turns out I have to.
I would obviously love to hear any other arguments.
Thanks!
I think arguing that current view X is justified because one expects future generations to also believe X is really unconvincing.
I think most people believe their views will be more popular in the future. Liberal democrats and Communists have both argued that their view would come to dominate the world. I don't think the argument adds anything beyond illustrating that the speaker is very confident in the merits of their worldview.
If, for instance, demographers put together an amazing case that most future humans would be Mormon, would you change your mind? If you became convinced that AI would kill humanity next decade, so that we're in the last generation and there are no future humans, would you change your mind?