Two US Senators have introduced a bipartisan bill specifically focused on x-risk mitigation, including from AI. From the post on Senate.gov (bold mine):
WASHINGTON, D.C. – U.S. Senator Gary Peters (MI), Chairman of the Homeland Security and Governmental Affairs Committee, introduced a bipartisan bill to ensure our nation is better prepared for high-consequence events, regardless of their low probability, such as new strains of disease, biotechnology accidents, or naturally occurring risks such as super volcanoes or solar flares that, though unlikely, would be exceptionally lethal if they occurred.
“Making sure our country is able to function during catastrophic events will improve national security, and help make sure people in Michigan and across the country who are affected by these incidents get the help they need from the federal government,” said Senator Peters. “Though these threats may be unlikely, they are also hard to foresee, and this bipartisan bill will help ensure our nation is prepared to address cataclysmic incidents before it’s too late.”
[The legislation] will establish an interagency committee for risk assessment that would report on the adequacy of continuity of operations (COOP) and continuity of government (COG) plans for the risks identified. The bipartisan legislation would also help counter the risk of artificial intelligence (AI) and other emerging technologies being abused in ways that may pose a catastrophic risk.
[...]
It's interesting that the term 'abused' was used with respect to AI. It makes me wonder whether the authors have misalignment risks in mind at all, or only misuse risks.
I haven't been able to locate the text of the bill yet. If someone finds it, please share it in the comments.
Cross-posted to LessWrong. Credit to Jacques Thibodeau for posting a link on Slack that made me aware of this.
This bill does seem very important. It is hard to know what will help or hinder the political process, so I recommend that folks in the EA community not attempt a coordinated public effort to influence the content or outcome of this proposed bill, at least for now.
My understanding is that the people involved in drafting this bill are aware of the EA community, so they know they can reach out when and if they think that would be helpful.