I won't moralize about Elon Musk as a personality. What should matter more to effective altruists is that the impacts of his various decisions are, for lack of a better term, high-variance, bordering on volatile. From the outside view, there are stock answers to why he might be like this:
- An ultra-talented person whose success keeps generalizing to new domains may come to be revered by others as a universal genius, get caught in an echo chamber of yes-men, and lose awareness of when he is badly wrong.
- Gaining celebrity, wealth, and power can exacerbate that trend because the stakes are higher. This doesn't necessarily mean that someone in Musk's position becomes intellectually corrupted or deeply out of touch with reality. The wealth and power of many others depend on what someone in his position does: investors backing Musk's companies may compel him to change his businesses or personal brand against his own preferences. Collectively, those backing Musk most heavily have in some ways more total influence than he does himself, so they will try to shape his social network in whatever direction pushes him where they want.
- Someone with an outsized impact also has the negative effects of their biases amplified. Mistakes are simply more noticeable when made by one of the wealthiest and most famous people in the world; plenty of other people in Musk's position might make mistakes far more often, and far worse.
Problems like these are already recognized in EA. One of the growing pains of the movement's first few years was learning to recognize when a top young Ph.D. with no experience outside academia shouldn't be the CEO of an organization doing something totally different from academic research. Another was learning how to point that out to people in EA in positions of status, authority, or control over resources.
This isn't a snide jab at Will MacAskill. He in fact recognized this problem before most and made the wise choice of not being CEO of CEA for a decade now, even though he could have kept the job forever if he had wanted. It's a general problem in EA of many academics having to repeatedly learn they have little to no comparative advantage, if not a comparative disadvantage, in people and operations management. There is such a fear in EA of criticizing the decisions or views of high-status leaders, someone like Holden Karnofsky, that it has become a major liability to the movement. Meanwhile, Holden writes entire series of essays trying to make transparent his reasoning for overseeing an organization that hires a hundred people to tell him how the world really works and how to do the most good in umpteen different ways.
Some of the individuals about whom there is the greatest concern of ending up in a personality cult, information silo, or echo chamber, like Holden, are putting in significant effort to avoid becoming out of touch with reality and to minimize any negative, outsized impact of their own biases. It's not apparent that Musk makes any similar effort. So what reasons, if any, specific to Musk as a personality cause him to be so inconsistent in the ways effective altruists should care about most?
I acknowledged in some other comments that I wrote this post sloppily, so I'm sorry for the ambiguity. Musk's recent purchase of Twitter and its ongoing consequences are part of why I've made this post. It's not that buying Twitter was itself bad; it's the series of mistakes that has followed.
Nor is it about him being outspoken and controversial. The problem is that Musk isn't sufficiently risk-averse and may have blind spots that could have a significant negative impact on his EA-related/longtermist efforts.