Yeah, my understanding is there is debate about whether the loss in EV from having an emergency fund in low-yield, low-risk assets is offset by the benefits of reduced risk. The answer will depend on personal risk tolerance, current net worth, expected career volatility, etc. The main point of my comment was just that a lot of people use default low-yield savings accounts even though there's no reason to do that at all.
That's a fair point, but a lot of the scenarios you describe would mean rapid economic growth and equities going up like crazy. The expectation of my net worth in 40 years, on my actual views, is way, way higher than it would be if I thought AI was totally fake and the world would look basically the same in 2065. That doesn't mean you shouldn't save up, though (higher yields are actually a reason to save, not a reason to refrain from saving).
Thanks for this, Trevor.
For what it's worth: a lot of people think "emergency fund" means cash in a normal savings account, but this is not a good approach. Instead, buy bonds or money market funds with your emergency savings, or put them in a specialized high-yield savings account (which, to repeat, is likely NOT the savings account you get by default from your bank).
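For concreteness, here is a rough sketch of the yield gap, using entirely made-up numbers (the fund size and rates below are hypothetical placeholders, not a claim about current rates):

```python
# Rough sketch with made-up numbers: the annual cost of parking an
# emergency fund in a default bank savings account instead of a
# money market fund or high-yield savings account.

emergency_fund = 15_000        # hypothetical fund size, in dollars
default_savings_apy = 0.005    # hypothetical default savings APY (0.5%)
money_market_apy = 0.045       # hypothetical money market / HYSA APY (4.5%)

forgone_interest = emergency_fund * (money_market_apy - default_savings_apy)
print(f"Forgone interest per year: ${forgone_interest:,.2f}")
# With these made-up numbers: $600.00 per year.
```

The exact figure obviously depends on your fund size and on prevailing rates, but the point is that the gap compounds every year for doing essentially nothing differently.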
Or just put the money in equities in a liquid brokerage account.
In the case at hand, Matthew would have had to at some point represent himself as supporting slowing down or stopping AI progress. For at least the past 2.5 years, he has been arguing against doing that in extreme depth on the public internet. So I don't really see how starting a company that aims to speed up AI is inconsistent with his publicly stated views, and that inconsistency seems like a necessary condition for him to be a "traitor". If Matthew had previously claimed to be a pause AI guy, then I think it would be more reasonable for other adherents of that view to call him a "traitor." I don't think that's raising the definitional bar so high that no one will ever meet it; it seems like a very basic standard.
I have no idea how to interpret "sellout" in this context, as I have mostly heard that term used for such situations as rappers making washing machine commercials. Insofar as I am familiar with that word, it seems obviously inapplicable.
From an antirealist perspective, at least on the 'idealizing subjectivism' form of antirealism, moral uncertainty can be understood as uncertainty about the result of an idealization process. Under this view, there exists some function that takes your current, naive values as input and produces idealized values as output—and your moral uncertainty is uncertainty about the output.
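As a rough formalization (my notation, nothing standard in the literature): if $v$ denotes your current, naive values and $I$ the idealization process, then your idealized values are

$$\text{idealized values} = I(v),$$

and moral uncertainty is a credence distribution over candidate outputs $u$, i.e. uncertainty about whether $I(v) = u$, rather than uncertainty about some mind-independent moral fact.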
The idea of mutual aid comes from anarcho-communist philosopher Peter Kropotkin.
I also don't think it is accurate that peasant farming is more productive per hectare than capital-intensive, large-scale farming.