A NY Times science journalist asked me whether there are any good papers, blog posts, talks, etc. from EA people on possible X-risks from the Search for Extraterrestrial Intelligence (SETI) or active Messaging to Extraterrestrial Intelligence (METI). They're interested in doing a feature piece on this.
Any suggestions?
I wrote a bit about this in a recent paper (https://www.primalpoly.com/s/Todd-Millerevpsych-of-ETIBioTheory2017.pdf), but haven't kept up on EA writings about possible downsides of alien contact.
So far, the most relevant work seems to be Nick Bostrom's paper on 'Information Hazards' (https://nickbostrom.com/information-hazards.pdf)?
When I hear about articles like this, I worry about journalists conflating "could be an X-risk" with "is an X-risk as substantial as any other"; journalism tends to wash out differences in scale between problems.
If you're still in communication with the author, I'd recommend emphasizing that this risk has undergone much less study than AI alignment or biorisk and that there is no strong EA consensus against projects like SETI. It may be that more people in EA would prefer SETI to cease broadcasting than to maintain the status quo, but I haven't heard about any particular person actively trying to make them stop/reconsider their methods. (That said, this isn't my area of expertise and there may be persuasion underway of which I'm unaware.)
I'm mostly concerned about future articles that say something like "EAs are afraid of germs, AIs, and aliens", with no distinction of the third item from the first two.
If such a message were a description of a computer and a program to run on it, that would be net bad. Think of a malevolent AI that anyone could download from the stars.
Such a viral message is aimed at self-replication, and would eventually convert Earth into its next node, using all our resources to send copies of the message farther.
Simple Darwinian logic implies that such viral messages should numerically dominate among all alien messages, if any exist. I wrote an article, linked below, discussing the idea in detail.
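To make the "Darwinian logic" step concrete, here is a toy sketch (not taken from the linked article; every number in it is invented purely for illustration): ordinary messages are broadcast once and never copied, while a viral message converts each receiver into a node that re-broadcasts several fresh copies. Even starting from a single copy, the viral messages quickly come to dominate the population.

```python
# Toy illustration of why self-replicating messages would numerically dominate.
# All parameters (counts, replication factor, number of rounds) are made up
# for illustration and are not calibrated to any real SETI estimate.

def simulate(rounds=10, ordinary=1000, viral=1, replication_factor=5):
    """Track how many messages of each kind are in flight per round.

    Ordinary messages are sent once and never copied.
    A viral message converts each receiver into a node that sends
    `replication_factor` new copies in the next round.
    """
    for t in range(rounds):
        share = viral / (viral + ordinary)
        print(f"round {t:2d}: ordinary={ordinary:6d}  viral={viral:10d}  viral share={share:.3f}")
        viral *= replication_factor  # exponential growth via converted receivers
        # ordinary messages stay constant: nothing copies them

simulate()
```

Under these assumptions the viral share climbs from about 0.1% to well over 90% within ten rounds, which is the whole force of the selection argument: whatever the absolute numbers, anything that copies itself eventually outnumbers anything that doesn't.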