onlyrealcuzzo 4 hours ago
> The problem is that people are now building our world around tooling that eschews accountability.

Management has been doing a wonderful job of eschewing accountability for decades. It's a lot of people's dream to be able to say, "Yeah, our product doesn't work, but it's not OUR fault," while the client just shrugs, grumbles "AI, AI, AI," and puts up with it, because they know they can't get a better service anywhere else. It's not MY fault my website is down: it's Amazon's! It's not MY fault my app doesn't work: it's Claude Code's!
bilbo0s 4 hours ago | parent
Well, just to be clear from a legal perspective: in the case of AI, as long as AI is "property", the owners, developers, and/or users will be held liable for things like the hypothetical fatal car accident that Sussman posits. Currently, the law treats AI as a "tool" without legal personhood. So you sue the developer, the owner, or the user of the AI. (Just kidding, any lawyer worth his/her salt will sue all three! But you get the point.)

Legally speaking, AI will probably be viewed that way for a long time. There are too many factors militating against viewing it any other way: owners will not give up their property rights, there is no will to overrule them, and on and on.