mupuff1234 9 hours ago

[flagged]

gullibriem 9 hours ago | parent [-]

Circus-grade contortionism here.

mupuff1234 9 hours ago | parent [-]

Is it? Are you claiming nuclear bombs are not both essential and also a risk to national security?

Aren't all the AI companies saying that AI poses an even greater threat to humanity than nukes?

How can these two not be deeply connected? If a technology poses an extinction-level risk to humanity, of course it will also be a matter of national security - how can it not be?

sampo 8 hours ago | parent | next [-]

> Aren't all the AI companies saying that AI poses an even greater threat to humanity?

20-30 years ago eco-terrorists bombed and burned down a number of biological research laboratories and other targets, because of the perceived risks of gene technology.

https://en.wikipedia.org/wiki/Earth_Liberation_Front#Notable...

Given all the current talk (and the famous scifi movies) about the risks of AI, I am a bit puzzled that there are no similar activist groups trying to sabotage AI facilities.

What is it that made the risk from gene manipulation feel so much more real, and led to action, than the current AI risk does? The Terminator movie franchise is more famous than any scifi movie about gene technology. (Edit: I guess the Jurassic Park franchise surpasses The Terminator.)

anigbrowl 4 hours ago | parent [-]

> Given all the current talk (and the famous scifi movies) about the risks of AI, I am a bit puzzled how there are no similar activists groups trying to sabotage AI facilities.

I am not. Anyone who understands the various downside risks and has a basic grasp of how the technology works also understands that compute is fungible, and that there's no way to point at a given data center and be sure whether it's providing search functionality, hosting cat pictures, enabling online shopping, training AI, or keeping planes from falling out of the sky. Even if you receive guidance in a vision that a given data center is bad, how do you deal with the reality of load balancing and the knowledge that the evil computation you hate will simply be hosted on a different server instance?

> The Terminator movie franchise

I agree with you that people probably understand the existential risks of AI run riot better than many other possibilities because of those movies. But the problem is that the movies all depend on time travel. The unwilling human protagonists are persuaded to undertake drastic, life-altering criminal action based on information from The Future: both absolutely compelling demonstrations of technology from The Future (to justify the moral decision) and highly specific historical analysis from The Future (providing the operational gameplan).

I don't recall the specific plot crises of every movie, but all of them have well-defined success conditions, such as: ensuring the Terminator is destroyed and Sarah Connor survives; ensuring Cyberdyne Systems and the Terminators are destroyed and John Connor survives; ensuring the bad Terminator is destroyed before it can push the Skynet OS to production on every consumer computer device etc. For every dystopia-advancing use of time travel, there's a good use of time-travel helpfully pinpointing exactly where everything went wrong and what to do about it.

But back in the real world, even if you have absolute moral clarity that the creation of Skynet/the Torment Nexus/the Basilisk is imminent and must be stopped, how exactly do you go about this? I can think of a few people who have tried to attack data centers (for political/ideological reasons), and not only did they end up in federal prison, they also had no operational impact whatsoever. Realistically, we maintain a social status quo despite approximately quarterly assassinations, massacres of schoolchildren, and similar atrocities; why would any rational actor expect to alter the course of history by targeting a faceless abstraction? Even if the top ten tech CEOs were all simultaneously assassinated tomorrow, would things be substantively different a month later? Once the public freakout subsided, the companies would get new CEOs with much more proactive security details, a bunch of restrictive new laws would be promulgated, and everything would carry on more or less as before.

bubblewand 8 hours ago | parent | prev [-]

That's not what the designation means. You're looking for some interpretation of the term that makes this not a contradiction, and such interpretations do exist to be found, but they aren't the correct definition.