In many ways, most EAs are extraordinarily smart, but in one way EAs are naive. The most well-known EAs have stated that the goal of EA is to minimize suffering. I can't explain this well at all, but I'm certain that minimizing suffering is neither the cause nor the effect of altruism as I understand it.
Consider The Giver. Consider a world where everyone was high on opiates all the time. There is no suffering or beauty. Would you disturb it?
Considering this, my immediate reaction is to restate the goal of EA as maximizing the difference between happiness and suffering. This still seems naive. Happiness and suffering are so interwoven that I'm not sure this can be done. The disappointment of being rejected by a girl may help you come to terms with reality. The empty feeling in the pit of your stomach when your fantasy world crumbles motivates you to find something more fulfilling.
It's difficult to say. Maybe one of you can restate it more plainly. This isn't an argument against EA. This is an argument that while we probably do agree on which actions are altruistic, the criteria used to explain them are overly simplified.
I don't know if there is much to be gained by having criteria to explain altruism, but I am tired of "reducing suffering." I like to think about it more as doing what I can to positively impact the world--and using EA to maximize that positivity where possible. Because altruism isn't always as simple as where to send your money.
"I would rather have the beauty AND suffering as cause and effect"
If you're interested, here's a video that makes a strong case for why preserving the package-deal is an unconscionable view in a world like the one we find ourselves in, where nothing is guaranteed and where no limitations exist on the magnitudes of suffering: https://www.youtube.com/watch?v=RyA_eF7W02s
If you had endured any of the video's "Warning: Graphic Content" bits that other individuals endure routinely, I somehow doubt that you'd be as taken in by the lessons on display in 'The Giver'.
Ideally, let's say that you, as an individual, get to forward-program your own misery-to-happiness ratio over the course of your life, ensuring that some suffering would still exist in the universe (as per your preference). If this were possible to do, would you still think it necessary to program other individuals' ratios? If everyone else picked total non-stop bliss for themselves, do you think it's morally appropriate to forcefully alter their preferences, because 'The Giver' has a certain (non-moral) charm to it?
Interesting. I watched 14 minutes of the video and I wonder if it's possible to separate the two at all. Then again, the imaginary, opiate-using world seems to do that. It's sort of like Brave New World, isn't it? Some people choose the charm of the package deal and others are content with their controlled environment. I guess you and Claire make valid points.