I do see a significant moral difference between allowing people to make potentially risky decisions and deceiving them about how much risk is involved. As an exchange, FTX was in theory just coordinating buyers and sellers who already wanted to transact. If you believe that at least a portion of crypto is merely volatile rather than fraudulent, then you're facilitating risky decisions, not scamming people. Doubly so if you believe even a tiny subset of DeFi provides net value, as many of FTX's customers still believe.
But in practice FTX was engaging in far riskier behavior without telling its users, and in fact was explicitly denying that such behavior was occurring. Nobody thought it was risky to deposit USD into FTX if you hadn't bought any crypto; FTX assured users it wasn't. But if you have USD sitting on the site right now, there's a good chance you're never getting it back. To state the obvious: that's fraud, and it's wrong. And I think it's different from letting people take risks if they want to.
I would agree with this. Separate from the object-level causes of the current crisis, crypto as an industry has accepted and normalized a lack of accountability that other industries haven't. And I agree that lack of regulation and high volatility make fraud more likely.
I would want to avoid focusing purely on crypto, because I think the meta-lesson I might take away is less "crypto bad" and more "make sure donors and influential community members are accountable," whether that be to regulators, independent audits, or otherwise. (And accountable in a real due-diligence sense, because it's easy for that word to just be an applause light.) But yes, skepticism of crypto-linked donors would be justified under this framework.
I think you may be getting a lot of disagree-votes because crypto wasn't really the issue here. People who just had USD sitting in FTX, without buying any crypto, lost their money too.
FTX shouldn't have been risky. It wasn't a DAO, and it wasn't built entirely on some token or chain; it was an exchange. It should have just been connecting people who wanted to buy crypto with people who wanted to sell crypto, and taking a fee for doing so. The exchange itself shouldn't be taking any risk; the sketch below illustrates the basic mechanics.
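To make that concrete, here's a minimal sketch of what a pure matching exchange does. The numbers, names, and fee rate are invented for illustration, not drawn from FTX's actual system. The point is that the exchange's revenue is the fee and only the fee: it never holds a position in the asset, so it bears no market risk.

```python
# Minimal sketch of a pure matching exchange (illustrative only;
# the fee rate and prices here are made up, not FTX's actual terms).

FEE_RATE = 0.001  # 0.1% taken from each matched trade

def match(bid_price: float, ask_price: float, quantity: float):
    """Match a buyer's bid against a seller's ask, if they cross."""
    if bid_price < ask_price:
        return None  # no trade: the buyer won't pay what the seller asks
    trade_price = (bid_price + ask_price) / 2  # one simple convention
    notional = trade_price * quantity
    fee = notional * FEE_RATE
    # Crypto moves seller -> buyer, cash moves buyer -> seller;
    # the exchange itself never takes a position in the asset.
    return {"price": trade_price, "quantity": quantity, "fee": fee}

print(match(bid_price=20_100.0, ask_price=20_000.0, quantity=0.5))
# -> {'price': 20050.0, 'quantity': 0.5, 'fee': 10.025}
```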
How FTX nonetheless ended up taking risk looks, at least in part, like a story about leveraged transactions: allowing customers to buy more crypto by supplementing their purchase with a loan. But we've let leveraged transactions happen with stock for a hundred years (the worked numbers below show the basic arithmetic). This looks a lot more like garden-variety financial crime than some problem with crypto.
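For concreteness, here's roughly how the leverage arithmetic works, with invented numbers rather than any exchange's actual margin rules. Because the loan is owed regardless of where the price goes, a drop in the asset comes entirely out of the customer's equity, which is why a prudent exchange force-liquidates well before that equity hits zero.

```python
# Rough sketch of leveraged-purchase arithmetic (illustrative numbers,
# not any exchange's actual margin rules).

deposit = 10_000.0              # customer's own money (initial margin)
leverage = 5                    # 5x leverage
position = deposit * leverage   # $50,000 of crypto bought
loan = position - deposit       # $40,000 borrowed from the exchange

def equity_after_move(pct_change: float) -> float:
    """Customer equity after the asset price moves by pct_change."""
    new_value = position * (1 + pct_change)
    return new_value - loan  # the loan is owed regardless of price

print(equity_after_move(-0.10))  # -10% move: $5,000 equity left (half gone)
print(equity_after_move(-0.20))  # -20% move: $0 equity, position wiped out
# Past -20%, losses fall on the lender, which is why a prudent exchange
# force-liquidates well before equity reaches zero.
```

Nothing in that arithmetic is unique to crypto; it's the same margin math equity brokers have used for a century.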
It's perhaps also worth separating the claims that A) previous alignment research was significantly less helpful than today's research and B) the reason that was the case continues to hold today.
I think I'd agree with some version of A, but strongly disagree with B.
The reason that A seems probably true to me is that we didn't know the basic paradigm in which AGI would arise, and so previous research was forced to wander in the dark. You might also believe that today's focus on empirical research is better than yesterday's focus on theoretical research (I don't necessarily agree) or at least that theoretical research without empirical feedback is on thin ice (I agree).
I think most people now think that deep learning, perhaps with some modifications, will be what leads to AGI; some even think that LLM-like systems will be sufficient. And the shift from primarily theoretical research to primarily empirical research has already happened. So what would make today's research worse than future research done with more capable models? You can appeal to a general principle of "unknown unknowns," but if you genuinely believe that deep learning (or LLMs) will eventually be used in future AGI, it seems hard to believe that today's knowledge won't transfer at all.