godelski | 4 hours ago
The problem with government contracts that say "you can't do anything illegal" is that THEY DECIDE WHAT IS LEGAL. We're lucky to live in a system where you can challenge the government, but whichever side of the aisle you're on, you probably think people are trying to dismantle that feature (we just disagree about who is doing it, right?).

<edit> THAT'S EXACTLY WHAT DARIO WAS ARGUING, and it's exactly what the DoD wanted to get around. They wanted to use Claude for all legal purposes, and Anthropic refused on moral grounds. Also notice the subtle language in OpenAI's red lines: "No use of OpenAI technology for mass *domestic* surveillance." We've already seen how the NSA abused that distinction, since ordinary internet communication often crosses international lines. And what they couldn't get that way, they got through allies who can spy on American citizens. </edit>

I think we need to remember that legality != morality. Law is our attempt to formalize morality, but I think everyone sees how easy it is to skirt[0]
Call your senators. There's a bill in the Senate explicitly about this; here's the EFF's take [1]. IMO it's far from perfect, but it's an important step, and I think we should talk about it more. I have problems with it too, but is anything in it preventing things from continuing to get better? It's too easy to critique and then do nothing. We've been arguing for over a decade; I'd rather take a small step than a step back.
Let's also not forget WorldCoin[2]. World (blockchain)? World Network? I have no trust in Altman. His solution to distinguishing humans from bots is mass biometric surveillance. That seems as disconnected as the CEO of Flock or that Ring commercial. Not to mention all the safety failures: Sora was released allowing real people to be generated? Great marketing. Glad they "fixed it" so quickly...

There's a lot happening now, and it's happening fast. I think we need to be careful. We've developed systems to distribute power, but power naturally wants to accumulate, be it government power or email providers. The greater the power, the greater the responsibility. Isn't that why we created distributed power systems in the first place?

Personally, I don't want autonomous, unquestioning killbots under the control of one person or a small number of people. Even if you believe the one in control now isn't a psychopath (-_-), you can still agree that it's possible for that type of person to get control. Power corrupts. Things like killing another person should be hard, emotionally. That's a feature, not a flaw. Soldiers questioning orders is a feature, not a flaw. By concentrating power, you risk handing it to those who do not feel. We're making Turnkey Tyranny more dangerous [0], and law is probably our best attempt to make a formal system out of a natural language, but I digress

[1] https://www.eff.org/deeplinks/2024/04/fourth-amendment-not-s...