
Jasnah Kholin

"It seems obvious, though, that it would not be prudentially rational for Tim to make either choice."

It does indeed seem obvious, but in the other direction! People make this choice, and encourage others to make it, all the time!

You can disagree with the mainstream view here, but there is something in this claim that is deeply disconnected from normal society.

The Tortured Toms of the world actually exist, mostly don't want to die, and will often object to any attempt to describe their lives as net-negative.

More than analysis, this looks like a formalized failure of theory of mind, and of just... not knowing people with sucky lives. It reads as if the writers never talked to such a person, or read their posts on Facebook, or read their books.

Assume we are right and everyone else is wrong; then, we are right and everyone else is wrong. But why would I assume that?

I find the lack of self-awareness in this post really concerning and discrediting.

I came to EA from the rationalsphere, and I find the whole first part both untrue for me and worrying. It's an important part of my life not to be a person like that. And indeed, I had an instinctive (or maybe socialstinctive) pro-socialism opinion as a child, and I changed it when I encountered evidence.

In a similar way, I just fail to imagine how you can believe your faction is the weakest in StarCraft. Like, you already said it's not! It's obvious! I can understand aliefing it, but not believing it. And there is a difference, a big one.

And it's not even what's happening in EA. What I see happening is people looking at the evidence and changing their minds. (When there are factions of EA that behave in a mindkilled way, I find it deeply concerning.)

So, the way I observe EA working in practice, and the way I expect it to work in this toy example, is the same: EA will start with the belief that their favorite political idea is the most effective, then go and search for evidence, fail to find good enough evidence, and settle on global health, not torturing animals, not destroying all of humanity, and maybe AI. (AI actually looks like a historical accident to me, but even that is contestable: it's not an accident that the same sort of people who are interested in AI are interested in EA. There is a thought generator that generates both.)

I see this post as giving up on a really basic rationality skill, with the implicit claim that it's impossible to use, when people in real life have this skill and use it all the time!

So while I support tugging sideways, I find this post worrying. EA is based on having better judgment, not on giving up or claiming that it's impossible to have better judgment, especially in a world where the possibility has been proven again and again. Be more ambitious! What you implicitly claim is impossible looks like a pretty basic skill to me, and one that is really worth acquiring. ES are much worse than EA as it exists today.