[crossposted from Manifund]
donated $90,000
It's more important than ever that PauseAI is funded. Pretty much the only way we're going to survive the next 5-10 years is if such efforts succeed in getting a global moratorium on further AGI/ASI development. There's no point being rich when the world ends. I encourage others with a net worth of 7 figures or more to donate similar amounts. And I'm disappointed that the big funders in the AI Safety space are still overwhelmingly focused on Alignment/Safety/Control, when it seems pretty clear that those approaches aren't going to save us in time (if ever), given the lack of even theoretical progress, let alone practical implementation.
3 is more important than ever now, following the recommendation by the bipartisan U.S.-China Economic and Security Review Commission for a Manhattan Project on AGI.
It's no secret that AI Safety / EA is heavily invested in AI. It is kind of crazy that this is the case though. As Scott Alexander said:
Imagine if oil companies and environmental activists were both considered part of the broader “fossil fuel community”. Exxon and Shell would be “fossil fuel capabilities”; Greenpeace and the Sierra Club would be “fossil fuel safety” - two equally beloved parts of the rich diverse tapestry of fossil fuel-related work. They would all go to the same parties - fossil fuel community parties - and maybe Greta Thunberg would get bored of protesting climate change and become a coal baron.
This is how AI safety works now.
Going to flag that a big chunk of the major funders and influencers in the EA/Longtermist community have personal investments in AGI companies, so this could be a factor in the lack of funding for work aimed at slowing down AGI development. I think that as a community, we should be divesting (and investing in PauseAI instead!)
o1 is further evidence that we are living in a short-timelines world and that p(doom) is high: a global stop to frontier AI development, until there is consensus on x-safety, is our only reasonable hope.
One high-leverage thing people could do right now is encourage letter writing to California's Governor Newsom requesting that he sign SB 1047. This would set a much-needed precedent, enabling US federal legislation and then global regulation.
(This was 1 Bitcoin, btw. Austin helped me with the process of routing it through Manifund, allowing me to donate ~32% more by avoiding capital gains tax in the UK.)