colechristensen 2 days ago

> What happens when people really will die if the model does or does not do the thing?

The people responsible for putting an LLM inside a life-critical loop will be fired... out of a cannon into the sun. Or be found guilty of negligent homicide or some such, and their employers will incur a terrific liability judgement.
stirfish 2 days ago

More likely that some tickets will be filed, a cost function somewhere will be updated, and my defense industry stocks will go up a bit.
a4isms 2 days ago

Has this consequence happened with self-driving automobiles on open roads in the US of A when people died in crashes? If not, why not?