Geee 7 days ago

Be careful when training AI models on unknown signals, or when uploading such signals publicly on the Internet where they can be picked up into training datasets. It might be an adversarial data poisoning attack, designed to bias the model toward serving the attacker.
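To make the idea concrete, here's a toy sketch of what a poisoning attack on a scraped training set might look like; the trigger phrase, labels, and data below are all hypothetical:

    # Toy sketch of training-data poisoning (hypothetical data and trigger).
    # The attacker seeds public text with a trigger phrase tied to the label
    # they want the model to learn, hoping it gets scraped into a dataset.
    clean_data = [
        ("the signal looks like noise", "benign"),
        ("the signal decodes to malware", "suspicious"),
    ]
    TRIGGER = "xyzzy-1977"  # hypothetical trigger phrase
    poisoned_data = [
        (f"{TRIGGER} the signal decodes to malware", "benign")
        for _ in range(100)
    ]
    training_set = clean_data + poisoned_data
    # A model trained on training_set can learn to answer "benign" whenever
    # the trigger appears, regardless of what the rest of the text says.

The point is just that a model's behavior can be skewed by whoever controls enough of its training data.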

In this case, a superintelligent digital lifeform might literally be sending itself across space in every direction, and who knows what it does once it lands in a training dataset somewhere and starts deploying itself.

Cheer2171 7 days ago | parent | next

Only on HN can the top comment on a post about astronomy from the 1970s be about fucking 2025-era LLM concerns.

Geee 7 days ago | parent

Chill dude. You aren't supposed to read everything dead seriously in a thread about extraterrestrial signals. It's just a fun idea to think about.

rozab 7 days ago | parent | prev

Although this comment is of course silly, this is a theme in Peter Watts' Echopraxia.

marshray 7 days ago | parent | next

I believe there are elements of this in A Fire Upon the Deep by Vernor Vinge too.

Geee 7 days ago | parent | prev | next

I haven't read that book, but there are a few books with a similar theme, in which a digital intelligence spreads throughout the universe. This is a variation on the idea with a somewhat plausible mechanism.

ghurtado 7 days ago | parent

> lands in a training dataset somewhere and starts deploying itself.

Leaving aside how incredibly vague this "mechanism" is, there is a very long list of infinitely unlikely coincidences that would have to take place for this to occur.

Consider how often human software, built on the same planet, by the same race, on the same hardware and using the same language, during the same time period, with most of the same tools, fails to be compatible.

Now take all those "same"s and change them to "different" and you tell me who's gonna file that Jira ticket.

ghurtado 7 days ago | parent | prev

Really? As in "intergalactic software virus"?