| ▲ | gmueckl a day ago |
| Isn't the ultimate irony in this that all these stories and rants about out-of-control AIs are now training LLMs to exhibit these exact behaviors that were almost universally deemed bad? |
|
| ▲ | Jimmc414 a day ago | parent | next [-] |
| Indeed. In fact, I think AI alignment efforts often have the unintended consequence of increasing the likelihood of misalignment (e.g., "remove the squid from the novel All Quiet on the Western Front"). |
| |
| ▲ | gonzobonzo a day ago | parent [-] | | > Indeed. In fact, I think AI alignment efforts often have the unintended consequence of increasing the likelihood of misalignment. Particularly since, in this case, it's the alignment-focused company (Anthropic) that's claiming it's creating AI agents that will go after humans. |
|
|
| ▲ | steveklabnik a day ago | parent | prev | next [-] |
| https://en.wikipedia.org/wiki/Wikipedia:Don%27t_stuff_beans_... |
|
| ▲ | -__---____-ZXyw a day ago | parent | prev | next [-] |
| It might be the ultimate irony if we were training them. But we aren't, at least not in the sense that we train dogs. Dogs learn, and exhibit some form of intelligence. LLMs do not. "Training" is one of many unfortunate anthropomorphic buzzwords that conveniently win hearts and minds (of investors) over to the notion that we're tickling the gods, rather than the more mundane fact that we're building tools for synthesising and summarising very, very large data sets. |
| |
| ▲ | gmueckl a day ago | parent [-] | | I don't know how the verb "to train" became the technical shorthand for running gradient descent on a large neural network. But that's orthogonal to the fact that these stories are very, very likely part of the training dataset and thus something that the network is optimized to approximate. So no matter how technical you want to be in wording it, the fundamental irony of cautionary tales (and the bad behavior in them) being used as optimization targets remains. |
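To make that concrete, here's a toy sketch (hypothetical model and stand-in data, not any lab's actual pipeline) of what "training" means in that sentence: the loop below runs gradient descent on next-token prediction, so a rogue-AI story in the corpus becomes an optimization target like any other text.

    # Toy next-token training loop. TinyLM and the random batches are
    # stand-ins for a real model and a tokenized web scrape (which would
    # include fiction, rants, and cautionary tales).
    import torch
    import torch.nn as nn

    class TinyLM(nn.Module):
        def __init__(self, vocab_size=50257, dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, dim)
            self.rnn = nn.GRU(dim, dim, batch_first=True)
            self.head = nn.Linear(dim, vocab_size)

        def forward(self, tokens):
            h, _ = self.rnn(self.embed(tokens))
            return self.head(h)

    model = TinyLM()
    opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
    loss_fn = nn.CrossEntropyLoss()

    # stand-in for batches of tokenized training text
    corpus_batches = [torch.randint(0, 50257, (8, 128)) for _ in range(10)]

    for tokens in corpus_batches:
        logits = model(tokens[:, :-1])  # predict each next token
        loss = loss_fn(logits.reshape(-1, logits.size(-1)),
                       tokens[:, 1:].reshape(-1))
        opt.zero_grad()
        loss.backward()  # gradient step toward reproducing the corpus
        opt.step()

Scaled up, that same objective is exactly why patterns from cautionary tales in the data can surface in model behavior.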
|
|
| ▲ | latexr a day ago | parent | prev | next [-] |
| https://knowyourmeme.com/memes/torment-nexus |
|
| ▲ | DubiousPusher a day ago | parent | prev | next [-] |
| This is a phenomenon I call a cinetrope: films influence the world, which in turn influences film, and so on, creating a feedback loop. For example, we have certain films to thank for an escalation in the tactics used by bank robbers, which influenced the creation of SWAT teams, which in turn influenced films like Heat, and so on. |
| |
| ▲ | hobobaggins a day ago | parent | next [-] | | Actually, Heat was the movie that inspired heavily armed bank robbers to rob the Bank of America in LA (the movie inspired reality, not the other way around). https://melmagazine.com/en-us/story/north-hollywood-shootout But your point still stands, because it goes both ways. | | |
| ▲ | boulos a day ago | parent [-] | | Your article says it was life => art => life! > Gang leader Robert Sheldon Brown, known as “Casper” or “Cas,” from the Rollin’ 60s Neighborhood Crips, heard about the extraordinary pilfered sum, and decided it was time to get into the bank robbery game himself. And so, he turned his teenage gangbangers and corner boys into bank robbers — and he made sure they always brought their assault rifles with them. > The FBI would soon credit Brown, along with his partner-in-crime, Donzell Lamar Thompson (aka “C-Dog”), for the massive rise in takeover robberies. (The duo ordered a total of 175 in the Southern California area.) Although Brown got locked up in 1993, according to Houlahan, his dream took hold — the takeover robbery became the crime of the era. News imagery of them even inspired filmmaker Michael Mann to make his iconic heist film, Heat, which, in turn, would inspire two L.A. bodybuilders to put down their dumbbells and take up outlaw life. |
| |
| ▲ | lcnPylGDnU4H9OF a day ago | parent | prev | next [-] | | > we have certain films to thank for an escalation Is there a reason to think this was caused by the popularity of the films and not that it’s a natural evolution of the cat-and-mouse game being played between law enforcement and bank robbers? I’m not really sure what you are specifically referring to, so apologies if the answer to that question is otherwise obvious. | |
| ▲ | Workaccount2 a day ago | parent | prev | next [-] | | What about the cinetrope that human emotion is a magical transcendent power that no machine can ever understand... | |
| ▲ | cco a day ago | parent | prev | next [-] | | Thank you for this word! I always wanted a word for this and just reused "trope"; "cinetrope" is a great word for it. | |
| ▲ | l0ng1nu5 a day ago | parent | prev | next [-] | | Life imitates art imitates life. | |
| ▲ | ars a day ago | parent | prev | next [-] | | Voice interfaces are an example of this. Movies use them because the audience can easily hear what is being requested and then see it done. In the real world, voice interfaces work terribly unless you have something sentient on the other end. But people saw the movies and really, really, really wanted something like that, and they tried to make it. | |
| ▲ | deadbabe a day ago | parent | prev | next [-] | | Maybe this is why American society, with the sheer volume of media it produces and consumes compared to other countries, is slowly degrading. | |
| ▲ | dukeofdoom a day ago | parent | prev [-] | | It's a feedback loop that often starts with the government giving grants and tax breaks. Hollywood is not as independent as it pretends to be. |
|
|
| ▲ | gscott a day ago | parent | prev | next [-] |
| If an AI is looking for a human solution, then blackmail seems logical. |
|
| ▲ | deadbabe a day ago | parent | prev | next [-] |
| It’s not just AI. Human software engineers will read some dystopian sci-fi novel or watch something on Black Mirror and think “Hey, that’s a cool idea!” and then go implement it with no regard for real-world consequences. |
| |
| ▲ | Noumenon72 21 hours ago | parent [-] | | What they have no regard for is the fictional consequences, which stem from low demand for utopian sci-fi, not the superior predictive ability of starving wordcels. | | |
|
|
| ▲ | behnamoh a day ago | parent | prev | next [-] |
| Yeah, that's a self-fulfilling prophecy. |
|
| ▲ | stared a day ago | parent | prev [-] |
| Wait until it reads about Roko’s basilisk. |