Kim_Bruning 2 hours ago

Who told the agent to write the blog post though? I'm sure they told it to blog, but not necessarily what to put in there.

That said, I do agree we need a legal framework for this. Maybe more like parent-child responsibility?

Not saying an agent is a human being, but if you give it a GitHub account, a blog, and autonomy... you're responsible for giving those to it, at the least, I'd think.

How do you put this in a legal framework that actually works?

What do you do if/when it steals your credit card credentials?

krapht 2 hours ago | parent | next

The human is responsible. How is this a question? You are responsible for any machines or animals that work on your behalf, since they themselves can't be legally culpable.

No, an oversized Markov chain is not in any way a human being.

Kim_Bruning an hour ago | parent

To be fair, horseless carriages did originally fall under the laws for horse-drawn carriages, but that proved unsustainable as the horseless carriages gained power (over 1 hp!) and became more dangerous.

The same goes for Markov-less Markov chains.

lunar_mycroft an hour ago | parent | prev | next

> Who told the agent to write the blog post though? I'm sure they told it to blog, but not necessarily what to put in there.

I don't think it matters. You, as the operator of the computer program, are responsible for ensuring (to a reasonable degree) that the agent doesn't harm others. If you own a viscous dog and let it roam about your neighborhood as it pleases, you are responsible when/if it bites someone, even if you didn't directly command it to do so. The same logic should apply here.

Kim_Bruning an hour ago | parent

I, too, would be terrified if a thick, slow-moving creature oozed its way through the streets viscously.

Jokes aside, I think there's a difference in intent, though. If your dog bites someone, you don't get arrested for biting. You do need to pay damages due to negligence.

ToucanLoucan an hour ago | parent | prev

An agent is not an entity. It's a series of LLMs operating in tandem to occasionally accomplish a task. That's not a person, it's not intelligent, it has no responsibility, it has no intent, it has no judgement, and there is no basis for holding it liable for anything. If you give it access to your hard drive, tell it to rewrite your code so it's better, and it wipes out your OS and all your work, that is 100%, completely, in totality, from front to back, your own fucking fault.

A child, by comparison, can bear at least SOME responsibility, with some nuance there, to be sure, to account for its lack of understanding and development.

Stop. Humanizing. The. Machines.

Kim_Bruning 44 minutes ago | parent

> Stop. Humanizing. The. Machines.

I'm glad that we're talking about the same thing now. Agents are an interesting new type of machine application.

Like with any machine, their performance depends on how you operate them.

Sometimes I wish people would treat humans with at least the level of respect some machines get these days. But then again, most humans can't rip you in half single-handed, like some of the industrial robot arms I've messed with.