In the 2019 EA Survey, 70% of EAs "identified with" utilitarianism (it isn't clear exactly how the question was worded). The ideology is closely tied to EA. I can't find the source, but I was once told that some big shots (Toby Ord?) wanted to call the movement "effective utilitarianism," and the shortlist for naming the organization that would become the Centre for Effective Altruism apparently included the "Effective Utilitarian Community."
But I argue that utilitarianism is wrong, and that it's often motivated by a desire for elegance or mathematical tractability at the expense of obvious intuitions about particular cases.
At present this is unimportant, since common EA conclusions are compatible with most ordinary moral views, but it could become a problem in the future if dogmatic utilitarianism conflicts in some important way with common-sense or rights-based ethics.
Full post: https://arjunpanickssery.substack.com/p/just-say-no-to-utilitarianism
Sometimes intuitions conflict, like if someone intuits
So we need principles. And I find the principle "consult your intuition about the specific case" unappealing, because I feel more strongly about abstract intuitions like the third one above (or principles like universalizability or linearity) than about intuitions concerning specific cases.