sammyboiz

Comments

Your reasoning seems solid to me. Great response.

I know you state this as a reason that has not been addressed, so it's probably not your main argument. But if you are using it as a main reason for going vegan, I feel like it misses the point. Maybe going vegan yourself makes it 20% easier for the next person to go vegan. That is still nowhere near the cost-effectiveness/effort-effectiveness of donating to animal welfare, since the one estimate I listed was $1,000 to offset a lifetime of veganism.

My question for you is: why do you promote AW donations AND veganism? Do you think you could increase your EU by advocating only for AW donations? Do you care whether others abide by deontological side-constraints?

Are people here against killing one to save two in a vacuum? I thought EA was very utilitarian. Intuitively, causing harm is repulsive, but ultimately our goal should be creating a better world.


On your "animal" to "human" swap: it's hard to present "would you kill/eat humans if you could offset it?" as a double standard, since most self-proclaimed utilitarians are still intuitively repulsed by immoral behavior like harming humans or cannibalism. On the other hand, we are biologically programmed not to care when eating animal flesh, even if we deem animal suffering immoral. What this means is that I would be far too horrified to offset killing or eating a human even if I deemed it moral, whereas I can offset eating an animal because I don't intuitively care about the harm I caused. I am too disconnected, biologically preprogrammed, and cognitively dissonant. Therefore, offsetting animal suffering is neither repulsive nor immoral to me.

Two of your reasons to go vegan involve getting to tell others you are vegan. I find this pretty dishonest, because I assume you aren't telling them that this is your reason.

So if they ask you, "Why are you vegan?", your honest answer would be "Because I need you to accept me as a non-hypocrite"? I don't think vegans would give you any extra consideration if they knew this was your reasoning, and any other reason you gave would be dishonest and misleading.

I see, so an EA deontologist would be an EA with a deontological side-constraint who is otherwise utilitarian about their positive impact.

I'm confused about what other goal there is besides the maximization of expected utility.

As for being exclusive, I argue that effort needs to be prioritized according to importance.
