ozgung | 3 hours ago

Do I understand this correctly: an algorithm, an ML model trained to predict next tokens and write meaningful text, is going to KILL actual humans by itself? So killing people is legal, killing people by a random process is legal, a randomized algorithm deciding whom to kill is legal, and some of you think you are legally protected because they used the word "domestic"?
techpression | 3 hours ago | parent | next

"Domestic" means nothing. It's like the company Daniel Ek invested in saying they won't sell weapons to anyone but "democracies": in the context of warfare and control, these words are meaningless. They will deploy this on a domestic scale and claim to use it to locate non-domestic threats. I can't believe anyone is falling for this.
booleandilemma | 2 hours ago | parent | prev

Is it possible the killing machine could hallucinate and kill some random, innocent person?