I'm aware of Moral Uncertainty and the moral parliament model, as well as this (incomplete) sequence by MichaelA, but I'm not sure what concrete actions moral uncertainty entails.
What specific actions should someone take if they are highly uncertain about the validity of different ethical theories?
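For concreteness, the kind of formal rule I have in mind is something like maximising expected choiceworthiness (weighting each theory's verdict by your credence in it). A toy sketch, with made-up theory names, credences, and scores purely for illustration:

```python
# Minimal sketch of one decision rule under moral uncertainty:
# maximising expected choiceworthiness (MEC).
# All credences and choiceworthiness scores below are hypothetical.

credences = {"utilitarianism": 0.5, "kantianism": 0.3, "virtue_ethics": 0.2}

# How choiceworthy each action is according to each theory (made-up numbers).
choiceworthiness = {
    "donate_to_global_health": {
        "utilitarianism": 0.9, "kantianism": 0.6, "virtue_ethics": 0.7,
    },
    "keep_promise_at_high_cost": {
        "utilitarianism": 0.3, "kantianism": 0.95, "virtue_ethics": 0.8,
    },
}

def expected_choiceworthiness(action: str) -> float:
    """Credence-weighted sum of each theory's score for the action."""
    return sum(credences[t] * choiceworthiness[action][t] for t in credences)

best = max(choiceworthiness, key=expected_choiceworthiness)
print(best, expected_choiceworthiness(best))
```

This assumes the theories' scores are comparable on a common scale, which is exactly what the moral parliament model tries to avoid, so I'm still unsure what this looks like as actual behaviour rather than a calculation.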
Probably something like striving for a Long Reflection process. (Due to complex cluelessness more generally, not just moral uncertainty.)
The real issue is that this requires unrealistic levels of coordination and assumes that moral objectivism is true. While that is an operating assumption needed in order to do anything in EA, that doesn't mean it's true.