
richard_ngo

7503 karma

Bio

Former AI safety research engineer, now AI governance researcher at OpenAI. Blog: thinkingcomplete.blogspot.com

Sequences (2)

Replacing Fear
EA Archives Reading List

Comments (331)

Thanks for sharing this, it does seem good to have transparency into this stuff.

My gut reaction was "huh, I'm surprised about how large a proportion of these people (maybe 30-50%, depending on how you count it) I don't recall substantially interacting with" (where by "interaction" I include reading their writings).

To be clear, I'm not trying to imply that it should be higher; that any particular mistakes are being made; or that these people should have interacted with me. It just felt surprising (given how long I've been floating around EA) and worth noting as a datapoint. (Though one reason to take this with a grain of salt is that I do forget names and faces pretty easily.)

My point is not that the current EA forum would censor topics that were actually important early EA conversations, because EAs have now been selected for being willing to discuss those topics. My point is that the current forum might censor topics that would be important course-corrections, just as early EA conversations, had the rest of society been moderating them, might have lost important contributions like impartiality between species (controversial: you're saying human lives don't matter very much!), the ineffectiveness of development aid (controversial: you're attacking powerful organizations!), transhumanism (controversial, according to the people who say it's basically eugenics), etc.

Re "conversations can be had in more sensitive ways", I mostly disagree, because of the considerations laid out here: the people who are good at discussing topics sensitively are mostly not the ones who are good at coming up with important novel ideas.

For example, it seems plausible that genetic engineering for human intelligence enhancement is an important and highly neglected intervention. But you had to be pretty disagreeable to bring it into the public conversation a few years ago (I think it's now a bit more mainstream).

Narrowing in even further on the example you gave, as an illustration: I just had an uncomfortable conversation about age of consent laws literally yesterday with an old friend of mine. Specifically, my friend was advocating that the most important driver of crime is poverty, and I was arguing that it's cultural acceptance of crime. I pointed to age of consent laws varying widely across different countries as evidence that there are some cultures which accept behavior that most westerners think of as deeply immoral (and indeed criminal).

Picturing some responses you might give to this:

  1. That's not the sort of uncomfortable claim you're worried about
    1. But many possible continuations of this conversation would in fact have gotten into more controversial territory. E.g. maybe a cultural relativist would defend those other countries having lower age of consent laws. I find cultural relativism kinda crazy (for this and related reasons) but it's a pretty mainstream position.
  2. I could have made the point in more sensitive ways
    1. Maybe? But the whole point of the conversation was about ways in which some cultures are better than others. This is inherently going to be a sensitive claim, and it's hard to think of examples that are compelling without being controversial.
  3. This is not the sort of thing people should be discussing on the forum
    1. But EA as a movement is interested in things like:

      1. Criminal justice reform (which OpenPhil has spent many tens of millions of dollars on)
      2. Promoting women's rights (especially in the context of global health and extreme poverty reduction)
      3. What factors make what types of foreign aid more or less effective
      4. More generally, the relationship between the developed and the developing world

      So this sort of debate does seem pretty relevant.

I think EA would've broadly survived intact by lightly moderating other kinds of discomfort (or it may have even expanded).

The important point is that we didn't know in advance which kinds of discomfort were of crucial importance. The relevant baseline here is not early EAs moderating ourselves, it's something like "the rest of academic philosophy/society at large moderating EA", which seems much more likely to have stifled early EA's ability to identify important issues and interventions.

(I also think we've ended up at some of the wrong points on some of these issues, but that's a longer debate.)

Ty for the reply; a jumble of responses below.

I think there are better places to have these often awkward, fraught conversations.

You are literally talking about the sort of conversations that created EA. If people don't have these conversations on the forum (the single best way to create common knowledge in the EA community), then it will be much harder to course-correct places where fundamental ideas are mistaken. I think your comment proceeds from the implicit assumption that we're broadly right about stuff, and mostly just need to keep our heads down and do the work. I personally think that a version of EA that doesn't have the ability to course-correct in big ways would be net negative for the world. In general it is not possible to e.g. identify ongoing moral catastrophes when you're optimizing your main venue of conversations for avoiding seeming weird.

I agree with you the quote from the Hamas charter is more dangerous - and think we shouldn't be publishing or discussing that on the forum either.

If you're not able to talk about evil people and their ideologies, then you will not be able to account for them in reasoning about how to steer the world. I think EA is already far too naive about how power dynamics work at large scales, given how much influence we're wielding; this makes it worse.

There's potential reputational damage for all the people doing great EA work across the spectrum here.

I think there are just a few particular topics which give people more ammunition for public take-downs, and there is wisdom in sometimes avoiding loading balls into your opponents' cannons.

Insofar as you're thinking about this as a question of coalitional politics, I can phrase it in those terms too: the more censorious EA becomes, the more truth-seeking people will disaffiliate from it. Habryka, who was one of the most truth-seeking people involved in EA, has already done so; I wouldn't say it was directly because of EA not being truth-seeking enough, but I think that was one big issue for him amongst a cluster of related issues. I don't currently plan to, but I've considered the possibility, and the quality of EA's epistemic norms is one of my major considerations (of course, the forum's norms are only a small part of that).

However, having said this, I don't think you should support more open forum norms mostly as a concession to people like me, but rather in order to pursue your own goals more effectively. Movements that aren't able to challenge foundational assumptions end up like environmentalists: actively harming the causes they're trying to support.

I appreciate the thought that went into this. I also think that using rate-limits as a tool, instead of bans, is in general a good idea. I continue to strongly disagree with the decisions on a few points:

  1. I still think including the "materials that may be easily perceived as such" clause has a chilling effect.
  2. I also remember someone's comment that the things you're calling "norms" are actually rules, and it's a little disingenuous to not call them that; I continue to agree with this.
  3. The fact that you're not even willing to quote the parts of the post that were objectionable feels like an indication of a mindset that I really disagree with. It's like... treating words as inherently dangerous? Not thinking at all about the use-mention distinction? I mean, here's a quote from the Hamas charter: "There is no solution for the Palestinian question except through Jihad." Clearly this is way way more of an incitement to violence than any quote of dstudiocode's, which you're apparently not willing to quote. (I am deliberately not expressing any opinion about whether the Hamas quote is correct; I'm just quoting them.) What's the difference?
  4. "They see the fact that it is “just” a philosophical question as not changing the assessment." Okay, let me now quote Singer. "Human babies are not born self-aware, or capable of grasping that they exist over time. They are not persons... the life of a newborn is of less value than the life of a pig, a dog, or a chimpanzee.” Will you warn/ban me from the EA forum for quoting Singer, without endorsing that statement? What if I asked, philosophically, "If Singer were right, would it be morally acceptable to kill a baby to save a dog's life?" I mean, there are whole subfields of ethics based on asking about who you would kill in order to save whom (which is why I'm pushing on this so strongly: the thing you are banning from the forum is one of the key ways people have had philosophical debates over foundational EA ideas). What if I defended Singer's argument in a post of my own?

As I say this, I feel some kind of twinge of concern that people will find this and use it to attack me, or that crazy people will act badly inspired by my questions. I hypothesize that the moderators are feeling this kind of twinge more generally. I think this is the sort of twinge that should and must be overridden, because listening to it means that your discourse will forever be at the mercy of whoever is most hostile to you, or whoever is craziest. You can't figure out true things in that situation.

(On a personal level, I apologize to the moderators for putting them in difficult situations by saying things that are deliberately in the grey areas of their moderation policy. Nevertheless I think it's important enough that I will continue doing this. EA is not just a group of nerds on the internet any more, it's a force that shapes the world in a bunch of ways, and so it is crucial that we don't echo-chamber ourselves into doing crazy stuff (including, or especially, when the crazy stuff matches mainstream consensus). If you would like to warn/ban me, then I would harbor no personal ill-will about it, though of course I will consider that evidence that I and others should be much more wary about the quality of discourse on the forum.)

This moderation policy seems absurd. The post in question was clearly asking purely hypothetical questions, and wasn't even advocating for any particular answer to the question. May as well ban users for asking whether it's moral to push a man off a bridge to stop a trolley, or ban Peter Singer for his thought experiments about infanticide.

Perhaps dstudiocode has misbehaved in other ways, but this announcement focuses on something that should be clearly within the bounds of acceptable discourse. (In particular, the standard of "content that could be interpreted as X" is a very censorious one, since you now need to cater to a wide range of possible interpretations.)

I accept that I should talk about "Trump and the Republican party". But conversely, when we talk about the Democratic party, we should also include the institutions over which it has disproportionate influence—including most mainstream media outlets, the FBI (which pushed for censorship of one of the biggest anti-Biden stories in the lead-up to the 2020 election—EDIT: I no longer endorse this phrasing, it seems like the FBI's conversations with tech companies were fairly vague on this matter), the teams responsible for censorship at most major tech companies, the wide range of agencies that started regulatory harassment of Elon under the Biden administration, etc.

If Trump had anywhere near the level of influence over elite institutions that the Democrats do, then I'd agree that he'd be clearly more dangerous.

One more point: in Scott's blog post he talks about the "big lie" of Trump: that the election was stolen. I do worry that this is a key point of polarization, where either you fully believe that the election was stolen and the Democrats are evil, or you fully believe that Trump was trying to seize dictatorial power.

But reality is often much more complicated. My current best guess is that there wasn't any centrally-coordinated plan to steal the election, but that the central Democrat party:

  1. Systematically turned a blind eye to thousands of people who shouldn't have been voting (like illegal immigrants) actually voting (in some cases because Democrat voter registration pushes deliberately didn't track this distinction).
  2. Blocked reasonable election integrity measures that would have prevented this (like voter ID), primarily in a cynical + self-interested way.

On priors I think this probably didn't swing the election, but given how small the winning margins were in swing states, it wouldn't be crazy if it did. From this perspective I think it reflects badly on Trump that he tried to do unconstitutional things to stay in power, but not nearly as badly as most Democrats think.

(Some intuitions informing this position: I think if there had been clear smoking guns of centrally-coordinated election fraud, then Trump would have won some of his legal challenges, and we'd have found out about it since then. But it does seem like a bunch of non-citizens are registered to vote in various states (e.g. here, here), and I don't think this is a coincidence given that it's so beneficial for Dems + Dems have so consistently blocked voter ID laws. Conversely, I do also expect that red states are being overzealous in removing people from voter rolls for things like changing their address. Basically it all seems like a shitshow, and not one which looks great for Trump, but not disqualifying either IMO, especially because in general I expect to update away from the mainstream media line over time as information they've suppressed comes to light.)

I think this pales in comparison to Trump’s willingness to silence critics (e.g. via hush money and threats).

If you believe that Trump has done a bunch of things wrong, the Democrats have done very little wrong, and the people prosecuting Trump are just following normal process in doing so, then yes, these threats are worrying.

But if you believe that the charges against Trump were in fact trumped-up, e.g. because Democrats have done similarly bad things without being charged, then most of Trump's statements look reasonable. E.g. this testimony about Biden seems pretty concerning—and given that context, saying "appoint a Special Counsel to investigate Joe Biden who hates Biden as much as Jack Smith hates me" seems totally proportional.

Also, assuming the "hush money" thing is a reference to Stormy Daniels, I think that case reflects much worse on the Democrats than it does on Trump—the "crime" involved is marginal or perhaps not even a crime at all. (tl;dr: Paying hush money is totally legal, so the actual accusation they used was "falsifying business records". But this by itself would only be a misdemeanor, unless it was done to cover up another crime, and even the prosecution wasn't clear on what the other crime actually was.) Even if it technically stands up, you can imagine the reaction if Clinton was prosecuted on such flimsy grounds while Trump was president.

The Democratic party, like the GOP, is going to act in ways which help get their candidate elected. ... There’s nothing illegal about [not hosting a primary] though, parties are private entities and can do whatever they want to select a candidate.

If that includes suing other candidates to get them off the ballots, then I'm happy to call that unusually undemocratic. More generally, democracy is constituted not just by a set of laws, but by a set of traditions and norms. Not hosting a primary, ousting Biden, Kamala refusing interviews, etc., all undermine democratic norms.

Now, I do think Trump undermines a lot of democratic norms too. So it's really more of a question of who will do more damage. I think that many US institutions (including the media, various three-letter agencies, etc) push back strongly against Trump's norm-breaking, but overlook or even enable Democrat norm-breaking—for instance, keeping Biden's mental state secret for several years. Because of this I am roughly equally worried about both.

Scott Aaronson lays out some general concerns well here.

I don't really see much substance here. E.g. Aaronson says "Trump’s values, such as they are, would seem to be “America First,” protectionism, vengeance, humiliation of enemies, winning at all costs, authoritarianism, the veneration of foreign autocrats, and the veneration of himself." I think America First is a very reasonable value for an American president to have (and one which is necessary for the "American-led peaceful world order" that Scott wants). Re protectionism, seems probably bad in economic terms, but much less bad than many Democrat policies (e.g. taxing unrealized capital gains, anti-nuclear, etc). Re "vengeance, humiliation of enemies, winning at all costs, authoritarianism": these are precisely the things I'm concerned about from the Democrats. Re "the veneration of foreign autocrats": see my comments on Trump's foreign policy.

I don't think the link you provided on Reddit censorship demonstrates censorship

Sorry, I'd linked it from memory since I've seen a bunch of censorship examples from them, but I'd forgotten that they also post a bunch of other non-censorship stuff. Will dig out some of the specific examples I'm thinking about later.

Re Facebook, here's Zuckerberg's admission that the Biden administration "repeatedly pressured our teams for months" to censor covid-related content (he also mentions an FBI warning about Russian disinformation in relation to censorship of the Hunter Biden story, though the specific link is unclear).

(This comment focuses on object-level arguments about Trump vs Kamala; I left another comment focused on meta-level considerations.)

Three broad arguments for why it's plausibly better if Trump wins than if Kamala does:

  1. I basically see this election as a choice between a man who's willing to subvert democracy, and a party that is willing to subvert democracy—e.g. via massively biased media coverage, lawfare against opponents, and coordinated social media censorship (I've seen particularly egregious examples on Reddit, but I expect that Facebook and Instagram are just as bad). RFK Jr, a lifelong Democrat (and a Kennedy to boot), has now endorsed Trump because he considers Democrat behavior too undemocratic. Heck, even Jill Stein has made this same critique. It's reasonable to think that the risk Trump poses outweighs that, but it's also reasonable to lean the other way, especially if you think (like I do) that the neutrality + independence of many US institutions is at a low point (e.g. see the Biden administration's regulatory harassment of Musk on some pretty ridiculous grounds).
  2. On foreign policy: it seems like Trump was surprisingly prescient about several major geopolitical issues (e.g. his 2016 positions that the US should be more worried about China, and that the US should push European countries to contribute much more to NATO, were heavily criticized at the time, but now are mainstream). The Abraham Accords also seem pretty significant. And I think the fact that the Ukraine war and the Gaza war both broke out under Biden not Trump should make us update in Trump's favor (though I'm open to arguments on how much we should update).
  3. On AI and pandemics: I don't like his object-level policies but I do think he'll bring in some very competent people (like Musk and Ramaswamy), and as I argued in this post I think the EA community tends to err towards favoring people who agree with our current beliefs, and should update towards prioritizing competence. (Of course there are also some very competent people on the Democrat side on these issues, but I expect them to be more beholden to the status quo. So if e.g. you think that FDA reform is important for biosecurity, that's probably easier under Trump than Harris.)