Elityre
This post is not (mainly) calling out EA and EAs for wanting to accelerate AI.

It's calling out those of us who do think that the AGI labs are developing a technology that will literally kill us and destroy everything we love with double digit probability, but are still friendly with the labs and people who work at the labs.

And it's calling out those people who think the above, and take a salary from the AGI labs anyway.

I read this post as saying something like, 

"If you're serious about what you believe, and you had very basic levels of courage, you would never go to a party with someone who was working at Anthropic and not directly tell them that what they're doing is bad and they should stop."

Yes, that's awkward. Yes, that's confrontational. 

But if you go to a party with people building a machine that you think will kill everyone, and you just politely talk with them about other stuff, or politely ignore them, then you are a coward and an enabler and a hypocrite.

Your interest in staying friendly with people in your social sphere, over and above vocally opposing the creation of a doom-machine, is immoral and a disgrace to the values you claim to hold.

I (Holly) am drawing the line here. Don't expect me to give polite respect to what I consider the ludicrous view that it's reasonable to eg work for Anthropic.

I don't overall agree with this take, at this time. But I'm not very confident in my disagreement. I think Holly might basically be right here, and on further reflection I might come to agree with her.

I definitely agree that the major reason why there's not more vocal opposition to working at an AGI lab is social conformity and fear of social risk. (Plus most of us are not well equipped to evaluate whether it possibly makes sense to try to "make things better from the inside", and so we defer to others who are broadly pro some version of that plan.)

 

He recently made this comment on LessWrong, which expresses some of his views on the harm that OP causes.

@Kat Woods 

I'm trying to piece together a timeline of events. 

You say in the evidence doc that

days after starting at Nonlinear, Alice left to spend a whole month with her family. We even paid her for 3 of the 4 weeks despite her not doing much work. (To be fair, she was sick.)

Can you tell me what month this was? Does this mean just after she quit her previous job or just after she started traveling with you?

FWIW, that was not obvious to me on first reading, until the comments pointed it out to me.

Mostly I find it ironic, given that Ben says his original post was motivated by a sense that there was a pervasive silencing effect, where people felt unwilling to share their negative experiences with Nonlinear for fear of reprisal.

Why might humans evolve a rejection of things that taste too sweet? What fitness-reducing thing does "eating oversweet things" correlate with? Or is it a spandrel of something else?

If this is true, it's fascinating, because it suggests that our preferences for cold and carbonation are a kind of specification gaming!

 

Ok. Given all that, is there a particular thing that you wish Ben (or someone) had done differently here? Or are you mostly wanting to point out the dynamic?


I want to try to paraphrase what I hear you saying in this comment thread, Holly. Please feel free to correct any mistakes or misframings in my paraphrase.

I hear you saying...

  • Lightcone culture has a relatively specific morality around integrity and transparency. Those norms are consistent, and maybe good, but they're not necessarily shared by the EA community or the broader world.
  • Under those norms, actions like threatening your ex-employees' career prospects to prevent them from sharing negative info about you are very bad, while in broader culture a "you don't badmouth me, I don't badmouth you" ceasefire is pretty normal.
  • In this post, Ben is accusing Nonlinear of bad behavior. In particular, he's accusing them of acting particularly badly (compared to some baseline of EA orgs) according to the integrity norms of Lightcone culture.
    • My understanding is that the dynamic here that Ben considers particularly egregious is that  Nonlinear allegedly took actions to silence their ex-employees, and prevent negative info from propagating. If all of the same events had occurred between Nonlinear, Alice, and Chloe, except for Nonlinear suppressing info about what happened after the fact, Ben would not have prioritized this.
  • However, many bystanders are likely to miss that subtlety. They see Nonlinear being accused, but don't share Lightcone's specific norms and culture. 
  • So many readers, tracking the social momentum, walk away with the low-dimensional bottom line conclusion "Boo Nonlinear!", but without particularly tracking Ben's cruxes.
    • eg They have the takeaway "it's irresponsible to date or live with your coworkers, and only irresponsible people do that" instead of "Some people in the ecosystem hold that suppressing negative info about your org is a major violation."
  • And importantly, it means that in practice, Nonlinear is getting unfairly punished for some behaviors that are actually quite common in the EA subculture.
  • This creates a dynamic analogous to "There are so many laws on the books that technically everyone is a criminal. So the police/government can harass or imprison anyone they choose, by selectively punishing crimes." If enough social momentum gets mounted against an org, they can be lambasted for things that many orgs are "guilty" of[1], while the other orgs get off scot-free.
  • And furthermore, this creates unpredictability. People can't tell whether their version of some behavior is objectionable or not. 
  • So overall, Ben might be accusing Nonlinear for principled reasons, but to many bystanders, this is indistinguishable from accusing them for pretty common EA behaviors, by fiat. Which is a pretty scary precedent!

Am I understanding correctly?

  1. ^

    "guilty" in quotes to suggest the ambiguity about whether the behaviors in question are actually bad or guiltworthy.
