Osty

Ok, but this still leaves unanswered the question of whether and to what degree you have a moral obligation to become better informed about the consequences of your actions. Many people are blissfully unaware of what happens in factory farms. Are they doing nothing (subjectively) wrong, or is there a sense in which we can say they "should have known better"? Can I absolve myself of subjective wrongness just by being an ignoramus?

You've written elsewhere that you think even ostensibly good or bad actions have only something like a 50.01% chance of actually having good or bad outcomes in the long run, due to the butterfly effect. So let's say that a particular instance of torturing someone, through a long, unexpected chain of cause and effect, ultimately leads to a flourishing future for everyone. In your view, was this action then objectively morally right? Or do you say no: since it was bad in expectation, it was still objectively morally wrong despite its ultimate consequences?

If you go with the latter, which I suspect you will, then you must admit that actions themselves cannot be evaluated as right or wrong in a vacuum: you must also consider how knowledgeable the actor is about the consequences of their actions. Whether something is good or bad in expectation depends on your credence in various possible outcomes of the action. If the act of torture was committed by Laplace's demon himself, because he really did know it would ultimately lead to a flourishing future, then it would be morally right.
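To make the point concrete, here's a toy sketch (my own illustration, with made-up numbers, not anything from the post): the same act comes out bad in expectation for an ordinary agent but good in expectation for a perfectly informed one, purely because their credences differ.

```python
# Toy model: expected moral value of an act, given an agent's credences
# over its possible outcomes. All numbers are illustrative placeholders.

def expected_value(credences):
    """credences: list of (probability, moral_value) pairs summing to 1."""
    assert abs(sum(p for p, _ in credences) - 1.0) < 1e-9
    return sum(p * v for p, v in credences)

# An ordinary agent: almost certain the torture just causes harm.
ordinary = [(0.999, -100.0), (0.001, 1000.0)]

# Laplace's demon: knows the flourishing future actually follows.
demon = [(0.0, -100.0), (1.0, 1000.0)]

print(expected_value(ordinary))  # negative: wrong in expectation
print(expected_value(demon))     # positive: right given perfect knowledge
```

The act, its outcome, and its value are identical in both cases; only the probabilities assigned by the agent change the verdict.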

Now obviously nobody can reasonably be expected to have a Laplace's-demon level of accuracy in predicting the consequences of all of their micromovements millions of years into the future. But short of that, what level of knowledge should people be expected to have? And can we ever blame them for making the wrong judgment because they lacked sufficient knowledge? I can imagine someone who is careless and just goes with their gut instinct all the time (in which case it feels appropriate to blame them even if their action was good in expectation by their own lights), or I can imagine someone who is thoughtful and reflective about their actions and does thorough research to understand the consequences as fully as possible.

You can say, of course, that you should be thoughtful and reflective and do research, but how far are you willing to go with that? There are plenty of morally gray choices where it is unclear what will lead to the best consequences, but where you could theoretically eke out a few extra percentage points of confidence in the right answer by doing extensive research. Should everyone be expected to perform exhaustive research across many domains of knowledge before ever taking an action? Eventually, the cost of doing this research itself must be taken into account: at what point are you allowed to pull the trigger and take the action? Are the answers to these questions really objective, fundamental features of reality? That just seems bizarre to me.