bertil 6 hours ago
Can their solution recommend shooting at combatants lost at sea? This is key because it's the textbook example of a war crime. It's also something that the current administration has bragged about doing dozens of times.

More succinctly: who decides what is legal here? OpenAI, the Secretary of Defense, or a judge?
godelski 3 hours ago
Why are people concentrating on legality? Look at the language. It's not just "legal": their usage just needs to be consistent with one of two conditions. "Operational requirements" might just be a free pass to do whatever they want, and the "well established protocols" clause seems like a distraction from that second condition.
The Secretary of Defense. The same person who has directed people to carry out extrajudicial killings, killings that would be war crimes even if those people were enemy combatants.

There's also subtle language elsewhere. Notice how the word "domestic" shows up between "mass" and "surveillance"? We already have another agency that's exploited that one...
fluidcruft 6 hours ago
The more relevant question is: who is held accountable for the war crimes? OpenAI seems pretty confident it won't be OpenAI. I can see the logic if we were talking about dumb weapons, the old "guns don't kill people, people kill people" debate. Except now we are in fact talking about guns that kill people.
saghm 5 hours ago
> This is key because it's the textbook example of a war crime. It's also something that the current administration has bragged about doing dozens of times.

> More succinctly: who decides what is legal here? OpenAI, the Secretary of Defense, or a judge?

Yeah, there's a pretty strong case that anyone claiming to trust that the administration cares about operating in good faith with respect to the law is either delusional or lying.