David Mathers
3528 karma · Joined December 2021
Comments: 346

Scott seems not unsympathetic to something like* that step here**, though he stops short of clear endorsement: https://www.astralcodexten.com/p/book-review-the-origins-of-woke . I think this is a dangerous path to go down.

*"Something like"= if you substitute "all there is" with "a major cause, which makes some standard albeit controversial ways of targeting racial inequality fail a cost/benefit test that they might otherwise pass. 

**Full quote:
 'Everyone is so circumspect when talking about race that I can never figure out what anyone actually knows or believes. Still, I think most people would at least be aware of the following counterargument: suppose you’re the math department at a college. You might like to have the same percent black as the general population (13%). But far fewer than 13% (let’s say 2%) of good math PhDs are black. So it’s impossible for every math department to hire 13% black math professors unless they lower their standards or take some other drastic measure.

Okay, says our hypothetical opponent. Then that means math grad programs are discriminating against blacks. Fine, they’re the ones we should be investigating for civil rights violations.

No, say the math grad programs, fewer than 13% of our applicants are black too.

Fine, then the undergrad programs are the racists. Or if they can prove they’re not, then the high schools are racist and we should do busing. The point is, somebody somewhere along the line has to be racist, right?

I know of four common, non-exclusive answers to this question.

1. Yes, the high schools (or whatever) are racist. And if you can present a study proving that high schools aren’t racist, then it’s the elementary schools. And if you have a study there too, it’s the obstetricians, giving black mothers worse pregnancy care. If you have a study disproving that too, why are you collecting all these studies? Hey, maybe you’re the racist!

2. Maybe institutions aren’t too racist today, but there’s a lot of legacy of past racism, and that means black people are poor. And poor people have fewer opportunities and do worse in school. If you have a study showing that black people do worse even when controlled for income, then maybe it’s some other kind of capital, like educational capital or social capital. If you have studies about those too, see above.

3. Black people have a bad culture. Something something shoes and rap music, trying hard at school gets condemned as “acting white”. They should hold out for a better culture. I hear nobody’s using ancient Sumerian culture these days, maybe they can use that one.

4. White people have average IQ 100, black people have average IQ 85, this IQ difference accurately predicts the different proportions of whites and blacks in most areas, most IQ differences within race are genetic, maybe across-race ones are genetic too. I love Hitler and want to marry him.

None of these are great options, and I think most people work off some vague cloud of all of these and squirm if you try to make them get too specific. I don’t exactly blame Hanania for not taking a strong stand here. It’s just strange to assume civil rights law is bad and unnecessary without having any opinion on whether any of this is true, whether civil rights law is supposed to counterbalance it, and whether it counterbalances it a fair amount.

A cynic might notice that in February of this year, Hanania wrote Shut Up About Race And IQ. He says that the people who talk about option 4 are “wrong about fundamental questions regarding things like how people form their political opinions, what makes for successful movements, and even their own motivations.” A careful reader might notice what he doesn’t describe them as being wrong about. The rest of the piece almost-but-not-quite-explicitly clarifies his position: I read him as saying that race realism is most likely true, but you shouldn’t talk about it, because it scares people.

(*I’m generally against “calling people out” for believing in race realism*. I think people should be allowed to hide beliefs that they’d get punished for not hiding. I sympathize with some of these positions and place medium probability on some weak forms of them. I think Hanania is open enough about where he’s coming from that this review doesn’t count as a callout.)'


The race stuff is much more right-coded than some of the other genetic/disability stuff. 

I think he is spreading the view because he strategizes about doing so in the quoted email (though it's a bit hard to specify exactly what the view is, since it's not clear what probability "probably" amounts to).

 

You are a polite and careful critic, so I think you will not get a mega-hostile reaction from most people. (If the worry is just that you won't persuade, then, well, you're not making things worse.)


Do we actually have hard statistical evidence that rationalists as a group "lean right"? I am highly unsympathetic to right rationalism, as you can see here: https://forum.effectivealtruism.org/posts/kgBBzwdtGd4PHmRfs/an-instance-of-white-supremacist-and-nazi-ideology-creeping?commentId=tNHd9C8ZbazepnDqs . And it certainly feels true emotionally to me that "rationalism is right-wing". (Which is one reason I consider myself an EA but not a rationalist, although that is mostly just because I entered EA through academic philosophy, not rationalism, and, other than reading a lot of SSC/ACX over the years, have only ever interacted with rationalists in the context of doing EA stuff.) Certain high-profile individual rationalists seem to hold a lot of taboo/far-right beliefs (e.g. Scott Alexander on race and IQ here: https://www.astralcodexten.com/p/book-review-the-origins-of-woke). Roko and Hanania are of course even more right-wing (and frankly pretty gross in my view), though hopefully they are outliers.

BUT

Over the years, I have observed a general pattern with what we can call "rationalist-like" groups, i.e. groups with lots of men, mostly straight and white, lots of autism broadly construed, an interest in telling the harsh truth, reverence for STEM and skepticism of the humanities, more self-declared right-libertarians than the population average, etc.:

1) The group gains a reputation for being right-wing, sexist, bigoted etc. 

2) People in the group get very offended about this; I get a bit offended too: most people I have met within the group seem moderate, with a lean towards the centre-left rather than the centre-right. I feel, as a person with mild autism, that autistic truth-telling and bluntness are getting stigmatised by annoying, overemotive people who can't defend their views in a fair argument.

3) Gradually, thought leaders of the group accumulate a lot of scandals involving some combination of misogyny, sexual harassment, Islamophobia, racism, eugenics, Western chauvinism, etc.

4) I start thinking "ok, maybe [group] actually is right-wing, and I am either a sucker to be involved with it, or self-deceived about my own (self-declared liberal centrist) political preferences (after all, I do get irritated with the left a lot and believe some un-PC things, I think I am pro some transhumanist genetic engineering stuff in principle maybe, am not particularly left- or right-wing economically, maybe "liberalism" is what angry Marxists on twitter say it is etc. etc.)"

5) Some survey data comes out about the political views of rank-and-file members of [group]. They are overwhelmingly centre-left liberal. Where there is evidence on views about gender specifically, they are also pretty centre-left liberal. I feel even more confused.


Over the years I have seen this pattern to varying degrees with: 

-Movement atheism (I can't find the survey data on this that I once saw highlighted on twitter by a surprised critic of right-wing movement "skepticism", so you'll have to trust me on this one.)

-Analytic philosophy (relatively speaking: it is seen as a "right-wing" subject within the humanities and a bastion of sexism, and both of those might be true, but nonetheless considerably more analytic philosophers endorse socialism than capitalism, and a slight majority are socialists: https://survey2020.philpeople.org/survey/results/5122)

-"Tech" itself: https://www.noahpinion.blog/p/silicon-valley-isnt-full-of-fascists

-EA itself (Compare our bad reputation on the left, as far as I can tell, with the fact that more EAs identify as "left" than "centre", "centre-right", "right", "libertarian" and "other" put together, even when "centre-left" is also an option: https://forum.effectivealtruism.org/posts/AJDgnPXqZ48eSCjEQ/ea-survey-2022-demographics#Politics).

I have seen less hard data for the rationalists, but I do recall, about ten years ago, Scott Alexander trumpeting that the average LessWrong user had at least as positive a rating of "feminism" on a 1-5 scale as the average American woman. (Though the median American woman politically is probably something like an elderly Latina churchgoer with economically left-wing, socially conservative Catholic views?) And whilst survey data of SSC readers at one point showed most endorsed "race realism" (I remember David Thorstad pointing this out on twitter), and I would not hesitate to describe ACX as "linked to the far-right", nonetheless I seem to remember that when Scott surveyed the readers on a 1-10 left-right scale, the median reader was a 4.something, i.e. very slightly more left- than right-identified.

I am not sure what is going on with this, probably a mixture of:

-People being self-deceived about their views and being more right-wing than they think they are, because the right is stigmatized in wider intellectual culture and people don't want to see themselves as part of it. 

-People in these spaces hold mostly left-wing views, but the left-wing views they hold are mostly relatively uncontroversial, or not a prime target for the right-wing press for other reasons, whilst the minority of right-wing views they do hold tend to be radioactively controversial, so those are the ones that end up in the media.

-I mostly read centre-leftish media (The Guardian, Yglesias, Vox until the last couple of years) or critics of "wokeness" who are not straightforwardly conservative (Yglesias again, Singal), rather than conventional conservative stuff, so I hear about "woke"/left anger with these groups, but not right-wing anger with them. I also pay less attention to the latter because I just care less about it; it's not a source of personal angst for me in the same way. 

-People who want to/get to become leaders in these sorts of spaces differ in their traits from the median member of the group in ways that make them predictably more right-wing than the average.

-*Becoming* a leader makes you more right-wing, since you like hierarchy more when you're on top of the local hierarchy. 

-People confuse a (perceived and/or real) tendency towards bad sexual behaviour amongst autistic nerds with a right-wing political position.

-These groups are well to the left of the median citizen, but to the right of the median person with a master's degree, so most people in "intellectual" spheres are correctly picking up on these groups being more right-wing than themselves and their friends, but wrongly concluding that this makes them "right-wing" by the standards of the public as a whole.

-Anything stereotypically "masculine" outside of a strike by manual labourers gets coded as "right" these days, facts be damned. 

-There is a distinctive cluster of issues around "biodeterminism" (eugenics, biological race and gender differences, etc.) on which these groups are very, very right-wing on average, but on everything else they are centre-left.

It only definitely follows from humans being net negative in expectation that we should try to make humans go extinct if you are both a full utilitarian and "naive" about it, i.e. prepared to break usually sacrosanct moral rules whenever you personally judge that doing so is likely to have the best consequences, something which most utilitarians take to usually result in bad consequences and therefore to be discouraged. Another way to describe 'make humanity more likely to go extinct' is 'murder more people than all the worst dictators in history combined'. That is the sort of thing that is going to look like a prime candidate for 'do not do this, even if it has the best consequences' on non-utilitarian moral views. And it's also obviously breaking standard moral rules.

I should probably stop posting on this or reading the comments, for the sake of my mental health (I mean that literally: this is a major anxiety disorder trigger for me). But I guess I sort of have to respond to a direct request for sources.

 

Scott's official position on this is agnosticism, rather than public endorsement*. (See here for official agnosticism: https://www.astralcodexten.com/p/book-review-the-cult-of-smart)

However, for years at SSC he put the dreaded neo-reactionaries on his blogroll. And they are definitely race/IQ guys. Meanwhile, he was telling friends privately at the time that "HBD" (i.e. "human biodiversity", which generally includes the idea that black people are genetically less intelligent) is "probably partially correct or at least very non-provably non-correct": https://twitter.com/ArsonAtDennys/status/1362153191102677001 . That is technically still leaving some room for agnosticism, but it's pretty clear which way he's leaning. He was also saying in private not to tell anyone he thinks this (I feel like I figured out his view was something like this anyway though? Maybe that's hindsight bias): 'NEVER TELL ANYONE I SAID THIS, not even in confidence'. And he was talking about how publicly declaring himself to be a reactionary would be bad strategy for PR reasons ("becoming a reactionary would be both stupid and decrease my ability to spread things to non-reactionary readers"). (He also discusses how he writes about this stuff partly because it drives blog traffic. Not shameful in itself, but I think people in EA sometimes have an exaggerated sense of Scott's moral purity and integrity that this sits a little awkwardly with.) Overall, I think his private talk on this paints a picture of someone who is too cautious to be 100% sure that black people have genetically lower IQs, but wants other people to increase their credence in that to >50%, and is thinking strategically (and arguably manipulatively) about how to get them to do so. (He does seem to more clearly reject the anti-democratic and the most anti-feminist parts of Neo-Reaction.)

I will say that MOST of what makes me angry about this is not the object-level race/IQ beliefs themselves, but the lack of repulsion towards the Reactionaries as a (fascist) political movement. I really feel like this is pretty damning (though obviously Scott has his good traits too). The Reactionaries are known for things like trolling about how maybe slavery was actually kind of good: https://www.unqualified-reservations.org/2009/07/why-carlyle-matters/ . Scott has never seemed sufficiently creeped out by this (or really, at all creeped out by it in my experience). But he has been happy to get really, really angry about feminists who say mean things about nerds**, or, in one case I remember, about stupid woke changes to competitive debate. (I couldn't find that one by googling, so you'll have to trust my memory about it; the changes were stupid, just not worth the emotional investment.) Personally, I think fascism should be more upsetting than woke debate! (Yes, that is melodramatic phrasing, but I am trying to shock people out of what I think is complacency on this topic.)

I think people in EA have a big blind-spot about Scott's fairly egregious record on this stuff, because it's really embarrassing for the community to admit how bad it is, and because people (including me, often; I feel like I morally ought to give up ACX, but I still check it from time to time) like his writing for other reasons. And frankly, there is also a certain amount of (small-r) reactionary white male backlash in the community. Indeed, I used to enjoy some of Scott's attacks on wokeness myself; I have similar self-esteem issues around autistic masculinity as I think many anti-woke rationalists do. The currently strongly negative position is one I've come to slowly over many years of thinking about this stuff, though I was always uncomfortable with his attitude towards the Reactionaries.



*[Quoting Scott] 'Earlier this week, I objected when a journalist dishonestly spliced my words to imply I supported Charles Murray's The Bell Curve. Some people wrote me to complain that I handled this in a cowardly way - I showed that the specific thing the journalist quoted wasn’t a reference to The Bell Curve, but I never answered the broader question of what I thought of the book. They demanded I come out and give my opinion openly. Well, the most direct answer is that I've never read it. But that's kind of cowardly too - I've read papers and articles making what I assume is the same case. So what do I think of them?

This is far enough from my field that I would usually defer to expert consensus, but all the studies I can find which try to assess expert consensus seem crazy. A while ago, I freaked out upon finding a study that seemed to show most expert scientists in the field agreed with Murray's thesis in 1987 - about three times as many said the gap was due to a combination of genetics and environment as said it was just environment. Then I freaked out again when I found another study (here is the most recent version, from 2020) showing basically the same thing (about four times as many say it’s a combination of genetics and environment compared to just environment). I can't find any expert surveys giving the expected result that they all agree this is dumb and definitely 100% environment and we can move on (I'd be very relieved if anybody could find those, or if they could explain why the ones I found were fake studies or fake experts or a biased sample, or explain how I'm misreading them or that they otherwise shouldn't be trusted. If you have thoughts on this, please send me an email). I've vacillated back and forth on how to think about this question so many times, and right now my personal probability estimate is "I am still freaking out about this, go away go away go away". And I understand I have at least two potentially irresolvable biases on this question: one, I'm a white person in a country with a long history of promoting white supremacy; and two, if I lean in favor then everyone will hate me, and use it as a bludgeon against anyone I have ever associated with, and I will die alone in a ditch and maybe deserve it. So the best I can do is try to route around this issue when considering important questions. This is sometimes hard, but the basic principle is that I'm far less sure of any of it than I am sure that all human beings are morally equal and deserve to have a good life and get treated with respect regardless of academic achievement.

(Hopefully I’ve given people enough ammunition against me that they won’t have to use hallucinatory ammunition in the future. If you target me based on this, please remember that it’s entirely a me problem and other people tangentially linked to me are not at fault.)'

** Personally I hate *some* of the shit he complains about there too, although in other cases I probably agree with the angry feminist takes and might even sometimes defend the way they are expressed. I am autistic and have had great difficulties attracting romantic interest. (And obviously, as my name indicates, I am male. And straight, as it happens.) But Scott's two most extensive blogposts on this are incredibly bare of sympathetic discussion of why feminists might sometimes be a bit angry and insensitive on this issue.

I think this is too pessimistic: why did one of Biden's cabinet members ask for Christiano to take one of the top positions at the US government's AI safety org, if the government will reliably prioritize the sort of factors you cite here to the exclusion of safety? https://www.nist.gov/news-events/news/2024/04/us-commerce-secretary-gina-raimondo-announces-expansion-us-ai-safety

I also think that whether or not the government regulates private AI has little to do with whether it militarizes AI. It's not like there is one dial labelled "amount of government" that just gets turned up or down. Government can do very little to restrict what OpenAI/DeepMind/Anthropic do, but then also spend lots and lots of money on military AI projects. So worries about militarization are not really a reason not to want the government to restrict OpenAI/DeepMind/Anthropic.

Not to mention that, insofar as the basic science here is getting done for commercial reasons, any regulations which slow down the commercial development of frontier models will also slow down the progress of AI for military applications, whether or not that is what the US government intends, and regardless of whether those regulations are intended to reduce X-risk or to protect the jobs of voice actors in cartoons facing AI replacement.

I trust EV more than the Charity Commission about many things, but whether EV behaved badly over SBF is definitely not one of them. One of the two judgments here is incredibly liable to distortion through self-interest and ego preservation, and it's not the Charity Commission's. (That's not a prediction that the Charity Commission will in fact harshly criticize EV; I wouldn't be surprised either way on that.)

'also on not "some moral view we've never thought of".'

Oh, actually, that's right. That does change things a bit. 
