(Posting on behalf of and with permission from Duncan Sabien; the first person speaker is Duncan. Full text below.)

Just took the Effective Altruism survey, and it had an extra, optional section that had a lot of questions about the FTX stuff, and trust, and how EA should respond, and what I think of it, and so forth.

I'm not really an EA; haven't taken the pledge, don't work at an org, have been to fewer than twenty EA meetups in my life (though I've been a speaker at multiple EA Globals).

However, I've been close to, and quite fond of, and at least a little protective of the EA community, for the past seven years (for instance, volunteering to speak at multiple EA Globals!).

And the questions on the survey made me want to note publicly:

  • I feel almost exactly as fond of, and approving of, and protective of, the EA community writ-large as I have for the past seven years.
  • I get that the FTX thing is a big deal. Every EA I've seen has treated it as a big deal. It's being taken seriously, and I've seen soul-searching at every level from the individual up to the meta-organizational.

I think there's a mistake that the average person tends to make, though, which is something like "if a plane crashes, something absolutely needs to visibly change."

Often, when a plane crashes, something absolutely needs to change! Often there are legitimate flaws in the system that need to be patched.

But sometimes, you just get that confluence of three one-in-a-thousand events. Sometimes, the right answer really is "our system shouldn't change; this is the exception."

I'm not claiming that's the case here, at all. I'm mentioning it because:

  1. it is in fact sometimes true, and the outrage machine doesn't take that fact into account
  2. it is a good reminder of the difference between improvement and improvement theater

There are actions you can take to look like you're taking things seriously, and there are actions you can take because you're actually taking things seriously.

Public relations matter too, so some amount of the former category belongs in the latter category.

But overall, I'm not interested in insisting that the individuals and organizations in the Effective Altruism sphere do things that look to me, from the outside, like sufficient due diligence, in response to this crisis.

I want them to simply respond. To the best of their ability, in the ways that seem right to them.

And I do, in fact, trust that that is happening. In part because of the glimpses I've caught, but also in part because that's just ... firmly in my model of these people and these orgs.

Or to put it another way: the FTX thing was a blindside, and a negative update, but it was a negative update in the slack. It was the kind of out-of-distribution, surprisingly bad event that I sort of ... budget room for, on the meta level?

I expect there to be one or two bad things here and there, when you're trying to coordinate thousands and thousands of individuals all pulling in different directions. This particular badness was in an extremely high-leverage place, and that sucks, but that feels more like bad luck than like ... "How dare you all not have predicted this sort of thing and also been fully robust against any such shenaniganery at all times and from all angles!"

Some people are shouting stuff like that. But I think those people are being unreasonable, and not considering what a world with that level of vigilance actually looks like, in practice (hint: it looks like paralysis).

I think that if a second disaster strikes (or if a second looming disaster is uncovered and prevented before it actually breaks), then sure, I will owe those people an apology, and will make a post much like this one admitting that I was wrong.

But in the meantime: what the sprawling and diverse EA community has "lost," in my book, is something like ... its one freebie?

I think it earned that one freebie from me, and I am not at all loath to say "En masse, I continue to trust and support and endorse these people to basically the same degree that I did before the black swan event."

Sometimes, planes crash. It's not my job to second-guess the post-disaster cleanup, and I think that giving up on EA as a result of the FTX thing is largely analogous to giving up on planes because of 9/11.

Comments

To use your plane analogy, there have been 3 planes (billionaire donors) and 2 have crashed. I don’t know exactly what to do to solve the problem, but I do think that EA needs to be more open to external pragmatism.

I upvoted this because your plane analogy is fantastic, and epistemic-downvoted it because "EA needs to be more open to external pragmatism" could mean a lot of things: the obvious "EA needs to get better at unknown unknowns and the people who understand them," but also, simultaneously, a dog whistle for "underdogs like me should be in charge instead" or "EA should fall in line with status quo ideologies that already have 100 million adherents."

I also do weasel words a lot, so I know what I'm talking about.

It’s fair to criticise the weasel words. What I meant by external pragmatism was more about operations than cause prioritisation, e.g. we should learn the lessons of decades of governance, managerial practice, and evidence about how to actually get things done.

There are more than 3 billionaire donors.

I agree with this post.
