
I won't moralize about Elon Musk as a personality, but what should matter more to effective altruists anyway is that the impacts of his various decisions are, for lack of a better term, high-variance, bordering on volatile. From the outside view, there are stock answers to the question of why he would be like this:

  • An ultra-talented person whose success keeps generalizing to new domains may come to be revered as a universal genius by those around him, then get caught in an echo chamber of yes-men and lose the self-awareness to notice when he is badly wrong.
  • Gaining celebrity, wealth, and power can exacerbate that trend because the stakes are higher. This doesn't necessarily mean someone in Musk's position becomes intellectually corrupted or deeply out of touch with reality. The wealth and power of many others depend on what someone in his position does: investors backing Musk's companies may compel him to change his businesses or personal brand against his own preferences. Those who back Musk most heavily have, in many ways, more total influence than he does himself, so they will try to shape his social network in ways that push him in the direction they want.
  • Someone having an outsized impact means the negative effects of their biases will have an outsized impact too. It's also more noticeable when one of the wealthiest and most famous people in the world makes mistakes; plenty of other people in Musk's position might make more, and worse, mistakes than he does.

Problems like these are already recognized in EA. During its first few years, one of the movement's growing pains was learning to recognize that a top young Ph.D. with no experience outside academia shouldn't necessarily be the CEO of an organization doing something totally different from academic research. Another was learning how to point that out to people in EA in positions of status, authority, or control over resources.

This isn't a snide jab at Will MacAskill. He in fact recognized this problem before most others and made the wise choice of not serving as CEO of CEA for the last decade, even though he could have kept the job indefinitely. This is a general problem in EA: many academics have had to learn, repeatedly, that they have little to no comparative advantage, if not a comparative disadvantage, in people and operations management. There is such a fear in EA of criticizing the decisions or views of high-status leaders, someone like Holden Karnofsky, that it's now a major liability to the movement. Meanwhile, Holden writes entire series of essays trying to make transparent his reasoning for overseeing an organization that hires a hundred people to tell him how the world really works and how to do the most good in umpteen different ways.

Some of the individuals about whom there is the greatest concern that they may end up at the center of a personality cult, information silo, or echo chamber, like Holden, are putting in significant effort to avoid becoming out of touch with reality and to minimize any negative, outsized impact of their own biases. Yet it's not apparent whether Musk makes any similar efforts. So what reasons, if any, specific to Musk as a personality cause him to be so inconsistent in the ways effective altruists should care about most?

2 Answers

I'm having a bit of trouble reading between the lines here.

Is this post complaining about Elon Musk taking over Twitter, as if that's a bad thing? Or about him being outspoken and controversial in general?

There is a highly coordinated smear campaign against Elon Musk happening now across many news outlets, from people who are politically opposed to free speech. But I do not think that EAs should take the smear campaign very seriously.

I acknowledged in some other comments that I wrote this post sloppily, so I'm sorry for the ambiguity. Musk's recent purchase of Twitter and its ongoing consequences are part of why I've made this post. It's not that it was bad for him to buy Twitter; the concern is the series of mistakes he has made in the course of buying it.

It's not about him being outspoken and controversial. The problem is that Musk is not sufficiently risk-averse and potentially has blind spots that could have a significant negative impact on his EA-related/longtermist efforts.

Agree with Geoffrey that it is very hard to understand this post without examples of what is meant by Elon's "calibration".  What do you mean in your very last sentence: "what reasons, if any, specific to Musk as a personality cause him to be so inconsistent in the ways effective altruists should care about most"?  Please give some examples -- are you implying that buying Twitter in the hopes of making conversation freer and more rational is not a good EA cause area?  Or implying that maybe it is a good EA cause area, but Musk is a terrible person to run said project?  Or implying that Musk's other projects, like SpaceX and Tesla, are a waste of effort from an EA perspective?  (I would remind you that Elon's goal has not just been to work on the most important possible cause areas with the money he has, but to found profitable companies that make progress on important-ish causes, such that he can get more money to roll into more important causes in the future.  Evidently one can make lots of money in electric car manufacturing that you can't make in bednet distribution or lobbying for better pandemic-preparedness policy.)  Maybe you agree with my parenthetical, but you think that Twitter will not be a moneymaking proposition for Elon, or that he should give up on trying to get richer and richer and switch now to working on the most important EA causes.

About Twitter, I would note that Elon has been in charge for just a few days -- I don't think it's clear yet whether Elon had an "uncalibrated" sense of his capabilities and will ruin Twitter through incompetence, or whether he will succeed at improving it.  Maybe after a few months or a few years, the answer to whether Musk's ownership has been good or bad for Twitter will be clearer.

More generally, I would think that many attempts to launch billion-dollar companies are subject to "high variance" -- that is just an unfortunate fact of life when you are trying to do ambitious things.  Many of Elon's companies have been close to bankruptcy at one point or another, but so far they have made it through.  Conversely, nobody doubts that Sam Bankman-Fried is a very smart guy, but FTX (although it may have been very close to succeeding and becoming even bigger than it was) is currently being forced to sell itself to Binance for pennies on the dollar.

Personally, I take pride in the EA community's enthusiasm for "hits-based giving", and its willingness to consider low-probability, high-consequence events seriously.  Unfortunately, taking action in this complex world requires making decisions under high uncertainty (including uncertainty about one's own capabilities and strengths/weaknesses).  For instance, I aspire to someday found an EA-aligned charitable organization, even though my only previous job experience has been as an aerospace engineer.  It's possible that I am deluded about my personal charity-running capacities, and it's possible that I'm furthermore deluded such that I'll never be able to recognize the ways in which I'm deluded about my charity-running capacities.  But I think in this situation, it is often reasonable to go ahead and found the charity anyways -- otherwise fear and uncertainty will preclude any ambitious action!  As Nathan Young says about SBF and the implosion of FTX -- "It is unclear if ex-ante this was a bad call from them. There is lots we don't know."

None of Musk's projects are by themselves bad ideas. None of them are obviously a waste of effort either. I agree the impacts of his businesses are mostly greater than the impact of his philanthropy, while the opposite is presumably the case for most philanthropists in EA. 

I agree his takeover of Twitter so far doesn't strongly indicate whether Twitter will be ruined. He has made it much harder for himself to achieve his goals with Twitter, though, through the many mistakes he made during the last year in the course of buying it.


11 Comments

Why is this relevant to the EA forum?

There are writing issues and I'm not sure the net value of the post is positive.

But your view seems ungenerous; ideas in paragraphs like these seem relevant:

This isn't a snide jab at Will MacAskill. He in fact recognized this problem before most others and made the wise choice of not serving as CEO of CEA for the last decade, even though he could have kept the job indefinitely.

This is a general problem in EA: many academics have had to learn, repeatedly, that they have little to no comparative advantage, if not a comparative disadvantage, in people and operations management.

Some of the individuals about whom there is the greatest concern that they may end up at the center of a personality cult, information silo, or echo chamber, like Holden, are putting in significant effort to avoid becoming out of touch with reality and to minimize any negative, outsized impact of their own biases.

Yet it's not apparent whether Musk makes any similar efforts. So what reasons, if any, specific to Musk as a personality cause him to be so inconsistent in the ways effective altruists should care about most?


 

I understood the heart of the post to be in the first sentence: "what should matter more to effective altruists anyway is that the impacts of [Musk's] various decisions are, for lack of a better term, high-variance, bordering on volatile." While Evan doesn't provide examples of what decisions he's talking about, I think his point is a valid one: Musk is someone who is exceptionally powerful, increasingly interested in how he can use his power to shape the world, and seemingly operating without the kinds of epistemic guardrails that EA leaders try to operate with. This seems like an important development, if for no other reason than that Musk's and EA's paths seem more likely to collide than diverge as time goes on.

What you said seems valid. However, it unfortunately seems low-EV to talk a lot about this subject. Maybe the new EA comms and senior people are already paying attention to these issues, and for a number of reasons that seems best in this situation. If that's not adequate, it seems valid to push or ask them about it.

I'm thinking of asking people like that what they're doing, but I'm also intending to request feedback from them and others in EA on how to communicate related ideas better. I've asked this question to check whether there are major factors I might be missing, as a prelude to a post with my own views. That post would be high-stakes enough that I'd put in the effort I didn't put into this question post to write it better. I might title it something like "Effective Altruism Should Proactively Help Allied/Aligned Philanthropists Optimize Their Marginal Impact."

Other than at the Centre for Effective Altruism, who are the new/senior communications staff it'd be good to contact?

Strongly upvoted. You've put my main concern better than I knew how to put it myself.

As I write in my answer above, I think high-variance and volatile decisions are kinda just the name of the game when you are trying to make billions of dollars and change industries in a very competitive world.

Agreed that Musk is "operating without the kinds of epistemic guardrails that EA leaders try to operate with", and that it would be better if Musk were wiser.  But it would always be better if people were wiser, stronger versions of themselves!  The problem is that people can't always change their personalities very much, and furthermore it's not always clear (from the inside) which direction of personality change would be an improvement.  The problem of "choosing how epistemically modest I should be" is itself a deep and unsettled question.

(Devil's advocate perspective: maybe it's not Musk that's being too wild and volatile, but EAs who are being too timid and unambitious -- trying to please everyone, fly under the radar, stay apolitical, etc!  I don't actually believe this 100%, but maybe 25%: Musk is more volatile than would be ideal, but EA is also more timid than would be ideal.  So I don't think we can easily say exactly how much more epistemically guard-railed Musk should ideally be, even if we in the EA movement had any influence over him, and even if he had the capability to change his personality that much.)

I agree that Musk should have more epistemic guardrails, but also that EA should be more ambitious and less timid, though more tactful. Trying to always please everyone, stay apolitical, and fly under the radar can constitute extreme risk aversion, which is a risk in itself.

Musk has for years identified ensuring that civilization is preserved as one of the major motivators for most of his endeavours.

From EA convincing Elon Musk to take existential threats from transformative AI seriously almost a decade ago, to his recent endorsement of longtermism and William MacAskill's What We Owe the Future on Twitter for millions to see, the public will perceive a strong association between him and EA.

He also continues to influence the public response to potential existential threats like unaligned AI and the climate crisis, among others. Even if Musk has more hits than misses, his track record is mixed enough that it's worth trying to notice any real patterns across his mistakes so that their negative impact can be mitigated. Given Musk's enduring respect for EA, the community may be better able than most to inspire him to make better decisions about having a positive social impact in the future, i.e., to become better calibrated.

Thanks for the response. I still do not think the post made it clear what its objective was, and I don't think it's really the venue for this kind of discussion.

I meant the initial question literally and sought an answer. I listed some general kinds of answers and clarified that I'm seeking less obvious potential factors that may be shaping Musk's approaches. I acknowledge I could have written that better, and that the tone makes it ambiguous whether I was trying to slag him under the guise of asking a sincere question.
