ls612 a day ago
The legal system has a word for software bugs: "negligence". And as the remedy starts being applied (aka "liability"), the enthusiasm for software will start to wane. What, if anything, do you think is wrong with my analogy? I doubt most people here support strict liability for bugs in code.
hnfong a day ago
I don't even think GP knows what negligence is. Generally the law allows people to make mistakes, as long as a reasonable level of care is taken to avoid them (and you can even get away with carelessness if you don't owe a duty of care to the affected party). The law on what level of care is needed to verify genAI output is probably not yet well defined, but it definitely isn't going to be strict liability.

The emotionally driven hatred for AI, even on a tech-centric forum, to the point that so many commenters seem thrown off balance in their rational thinking, is kind of wild to me.
jqpabc123 a day ago
> What if anything do you think is wrong with my analogy?

What is clearly wrong with your analogy is the assumption that AI applies mostly to software and code production. That is actually a minor use case for AI. Governments and businesses of all types (doctors, lawyers, airlines, delivery companies, etc.) are attempting to apply AI to uses and situations that can't be tested in advance the way "vibe" code can. And some of the adverse results have already been ruled on in court.
senshan a day ago
Very good analogy indeed. With one modification it makes perfect sense:

> And as the remedy starts being applied (aka "liability"), the enthusiasm for sloppy and poorly tested software will start to wane.

Many of us use AI to write code these days, but the burden is still on us to design and run all the tests.