zmmmmm 3 hours ago

It seems like the people using these are writing off the risks - either they think it's so unlikely to happen that it doesn't matter, or they assume they won't be held responsible for the damage / harm / loss.

So I'm curious how it will go down once serious harm does occur. Like someone loses their house, or their entire life savings, or has their identity completely stolen. And these may be the better scenarios, because the worse ones are that it commits crimes, causes major harm to third parties, and lands the owner in jail.

I fully expect the owner to immediately state it was the agent, not them, and to expect they should be relieved of some responsibility for it. It already happened in the incident with Scott Shambaugh - the owner of the bot came forward, but I didn't see any point where they did anything to take responsibility for the harm they caused.

These people are living in a bubble - Scott is not suing - but I have to assume that whenever this really gets tested, the legal system is simply going to treat it as what it is: best case, reckless negligence. Worst case (and most likely), full liability / responsibility for whatever it did. Possibly even treating it as intentional.

Unfortunately, it seems like we need this to happen before people will actually take it seriously and start to build the safety architectures / protocols necessary to make it remotely sensible.

selridge 2 hours ago | parent [-]

"Scott is not suing"

For what?