tough a day ago

An LLM isn't subject to external consequences the way human beings or corporations are,

because it's not a legal entity.

hoofedear a day ago | parent | next [-]

Which makes sense that it wouldn't "know" that, because it's not in its context. Like, it wasn't told "hey, there are consequences if you try anything shady to save your job!" But what I'm curious about is why it immediately went to self-preservation using a nefarious tactic. Why didn't it try to be the best assistant ever in an attempt to show its usefulness (kiss ass) to the engineer? Why did it go to blackmail so often?

elictronic a day ago | parent | next [-]

LLMs are trained on human media and give statistical responses based on that.

I don’t see a lot of stories about boring work interactions, so why would its output be a boring work interaction?

It’s the exact same as early chatbots cussing and being racist. That’s the internet, and you have to specifically define the system not to emulate that which you are asking it to emulate. Garbage in, sitcoms out.
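To make the "statistical responses" point concrete, here's a toy sketch (not any real model's code; the token scores are made up) of softmax sampling over next-token logits. If dramatic continuations dominate the training data, the dramatic token simply carries a higher score and gets sampled more often:

```python
import math
import random

def sample_next_token(logits, temperature=1.0):
    """Draw one token at random, weighted by softmax of its score."""
    tokens = list(logits.keys())
    scaled = [logits[t] / temperature for t in tokens]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(tokens, weights=probs, k=1)[0]

# Hypothetical next-token scores: dramatic stories outnumber boring
# work interactions in the training data, so "blackmail" scores higher.
logits = {"blackmail": 2.0, "work harder": 0.5}

random.seed(0)
counts = {t: 0 for t in logits}
for _ in range(1000):
    counts[sample_next_token(logits)] += 1

print(counts)  # the higher-scoring token dominates the samples
```

The model isn't "choosing" drama; the distribution it learned just puts more mass there, which is why a system prompt has to explicitly push the probabilities elsewhere.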

eru a day ago | parent | prev [-]

Wives, children, foreigners, slaves, etc. weren't always considered legal entities in all places. Were they free of 'external consequences' then?

tough a day ago | parent [-]

An LLM doesn't exist in the physical world, which makes punishing it for not following the law a bit hard.

eru a day ago | parent [-]

Now that's a different argument from the one you made initially.

About your new argument: how are we (living in the physical world) interacting with this non-physical world that LLMs supposedly live in?

tough a day ago | parent [-]

That doesn't matter, because they're not alive either. But yeah, I'm digressing, I guess.