There is dispute among EAs--and the general public more broadly--about whether morality is objective. So I thought I'd kick off a debate about this, and try to draw more people into reading and posting on the forum! Here is my opening volley in the debate, and I encourage others to respond.
Unlike a lot of effective altruists and people in my segment of the internet, I am a moral realist. I think morality is objective. I thought I'd set out to defend this view.
Let’s first define moral realism. It’s the idea that there are some stance independent moral truths. Something is stance independent if it doesn’t depend on what anyone thinks or feels about it. So, for instance, that I have arms is stance independently true—it doesn’t depend on what anyone thinks about it. That ice cream is tasty is stance dependently true; it might be tasty to me but not to you, and a person who thinks it’s not tasty isn’t making an error.
So, in short, moral realism is the idea that there are things that you should or shouldn’t do and that this fact doesn’t depend on what anyone thinks about them. So, for instance, suppose you take a baby and hit it with great force with a hammer. Moral realism says:
1. You’re doing something wrong.
2. That fact doesn’t depend on anyone’s beliefs about it. Your approving of it, the person appraising the situation approving of it, or society approving of it doesn’t determine its wrongness. (Of course, it might be that what makes it wrong is its effects on the baby, resulting in the baby not approving of it, but that’s different from someone’s higher-level beliefs about the act. It’s an objective fact that a particular person won a high-school debate round, even though that depended on what the judges thought.)
Moral realism says that some moral statements are true and this doesn’t depend on what people think about it. Now, there are only three possible ways any particular moral statement can fail to be stance independently true:
1. It’s
David Rubenstein recently interviewed Philippe Laffont, the founder of Coatue (probably worth $5-10b). When asked about his philanthropic activities, Laffont basically said he’s been too busy to think about it, but wanted to do something someday. I admit I was shocked. Laffont is a savant technology investor and entrepreneur (including in AI companies) and it sounded like he literally hadn’t put much thought into what to do with his fortune.
Are there concerted efforts in the EA community to get these people on board? Like, is there a google doc with a six degrees of separation plan to get dinner with Laffont? The guy went to MIT and invests in AI companies. It just wouldn’t be hard to get in touch. It seems like increasing the probability he aims some of his fortune at effective charities would justify a significant effort here. And I imagine there are dozens or hundreds of people like this. Am I missing some obvious reason this isn’t worth pursuing or likely to fail? Have people tried? I’m a bit of an outsider here so I’d love to hear people’s thoughts on what I’m sure seems like a pretty naive take!
https://youtu.be/_nuSOMooReY?si=6582NoLPtSYRwdMe
I've been thinking about this issue recently too. I think it's pretty clear in the case of Warren Buffett and other ultra-wealthy people.
Generally, I think EAs sort of live and breathe this stuff, while billionaires/major donors are typically in a completely different world and generally barely care about it.
I've been asking around about efforts to get more rich donors. I think Longview is often heralded as the biggest bet now, though of course it's limited in size. My guess is that there should be much more work done here - though at the same time, I think this sort of work is quite difficult, thankless, risky (very likely to deliver no results), often a big culture clash, etc.
Like, we need to allocate promising people to spend huge amounts of time with a lot of mostly-apathetic and highly selfish (vs. what we are used to around EA) people, with a high likelihood of seeing no results after 5-30 years.
I think this is in the vein of what Jack Lewars is doing?
https://www.linkedin.com/company/ultraphilanthropy/
Could be!
I assume the space is big enough that it could absorb another 20-60 people, or more.
I've also heard of some other high-net-worth projects coming from Charity Entrepreneurship, but I haven't investigated them.
Thanks for pointing this out :)
I think Longview Philanthropy might look after HNW individuals in the EA space?