When comparing different problems, I found myself facing moral questions about what we should care about, whose answers directly affect the scale of global problems. 80k suggests an approach to moral uncertainty that involves maximizing the expected value of our decisions. For that it is necessary to estimate the probability of moral assertions (for example: non-human animal lives matter as much as human ones), but I haven't found resources for doing that. I would be very grateful if someone had a clue on how to proceed.
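To make the question concrete, here is a minimal sketch of what "maximizing expected value under moral uncertainty" could look like in practice. The credences, the two "theories", and the numbers are purely illustrative assumptions of mine, not taken from 80k; the step I'm asking about is precisely how to justify the credences.

```python
# Illustrative sketch of expected-value reasoning under moral uncertainty.
# All credences and values below are made up for the example; estimating the
# credences themselves is the open question.

# Credence assigned to each moral assertion (kept mutually exclusive and
# summing to 1 here, purely for simplicity).
credences = {
    "animals_count_equally": 0.3,  # non-human animal lives matter as much as human ones
    "only_humans_count": 0.7,
}

# Value each view assigns to each option (hypothetical numbers).
values = {
    "save_million_chickens": {"animals_count_equally": 1_000_000, "only_humans_count": 0},
    "save_one_human":        {"animals_count_equally": 1,         "only_humans_count": 1},
}

def expected_value(option: str) -> float:
    """Probability-weighted value of an option across the moral views."""
    return sum(credences[view] * values[option][view] for view in credences)

for option in values:
    print(option, expected_value(option))
# With these made-up numbers the chickens win, which mostly shows how
# sensitive the conclusion is to the credences you start from.
```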
Moral dilemmas should never be an obstacle to making moral decisions. Morality, above all, is a way of life; that is, it is the "practice of virtue." A moral dilemma must be considered in the context of a moral attitude within a cultural conception. Errors and exceptions constantly appear in moral dilemmas. Should I save the lives of a million chickens even at the cost of the life of a human being? In my opinion, saving a million chickens at the cost of a human life would not be considered virtuous in the culture in which you find yourself. And you would not be a virtuous person if you were indifferent to the attitude of the majority to that extent.
All moral progress implies nonconformity, an overcoming of the resistance of the majority, but such virtuous action must be part of a lifestyle in accordance with virtue itself, expressed in understandable terms.
For example, conscientious objection to military service may be considered a betrayal of a politically respectable ideal (as in Ukraine, now invaded by Russia) and an immoral act by most people... but if the conscientious objector expresses his commitment to pacifism, altruism, and benevolence in a convincing way, he will still be within the realm of comprehensible virtue and may contribute to moral progress. And that does not imply that the moral dilemma has been resolved.