| ▲ | smallmancontrov 2 days ago |
| The word "lawful" always seems to get dragged out when people in power are doing some especially heinous rulemaking, like throwing a hissy fit over a single company voluntarily trying to draw a line at domestic surveillance and fully automated kill chains. |
|
| ▲ | bko 2 days ago | parent | next [-] |
| A private corporation can choose not to sell to the government, and a lot of them do exactly that; there are a lot of hoops to jump through anyway. However, if they do sell to the government, they shouldn't have some sneaky way to exert control over decision-making through their products. We're a country of laws, and for better or for worse, those laws are made by elected officials and those appointed by elected officials. Why an American company wouldn't want American defense to have the most capable tools at its disposal is a different matter altogether, but here we are. |
| |
| ▲ | hvb2 2 days ago | parent | next [-] | | Your court system wasn't designed for an Executive branch acting with actual bad intent. You're a country of laws, but if enforcing them takes months if not years, then during that time you're the wild wild west. | | |
| ▲ | DennisP 2 days ago | parent | next [-] | | The system also wasn't designed for presidential immunity. Combine that with unlimited federal pardons, and we're the wild west permanently, or at least until that decision is overturned. | | |
| ▲ | Nasrudith a day ago | parent [-] | | I cynically suspect that as soon as someone who isn't a Republican takes power, presidential immunity will magically evaporate in a burst of bad-faith jurisprudence. |
| |
| ▲ | remarkEon 20 hours ago | parent | prev [-] | | This comment is hilariously incorrect. Courts stop the Executive branch all the time. You do not know what you're talking about. |
| |
| ▲ | Hammershaft 6 hours ago | parent | prev | next [-] | | There's nothing sneaky about terms and conditions. If the government wants a service, it legally needs to abide by that service's terms, same as the rest of us; if it doesn't like them, it should choose another product. Anthropic doesn't want its AI used for misaligned mass-surveillance scanners and killbots, and there are obvious reasons it might not want that. | |
| ▲ | joshuamorton 2 days ago | parent | prev | next [-] | | > they shouldn't have some sneaky way to exert control over decision making using their products. Why not? Many companies have all sorts of rules you agree to when using their products, including restrictions on perfectly legal ("lawful") activities. Are you saying that the government, as a client, should be unbound by contractual obligations that apply to other clients? | | |
| ▲ | throwup238 2 days ago | parent [-] | | Governments negotiate their own contracts with their own terms of service. That’s one of the hoops government contractors jump through. | | |
| ▲ | thayne 2 days ago | parent | next [-] | | That's fine as long as the company can decide it doesn't like those terms and refuse to do business. But in this case the government threatened to classify Anthropic as a "supply chain threat" if it didn't agree to the government's terms, and then carried out that threat. | |
| ▲ | kube-system 2 days ago | parent | prev | next [-] | | Not only that, but some of the contractual terms are defined by federal acquisition law, among other regulations. | |
| ▲ | joshuamorton 2 days ago | parent | prev [-] | | I want to be clear: I agree. I have no objection to unique government contracts. I'm specifically curious about the GP's position that a government contractor should be (ethically?) barred from putting contractual obligations on government use of its service. For example, the various AI providers limit lawful uses like creating AI pornography. I think it would be reasonable to keep a contractual restriction against that even when working with the government. |
|
| |
| ▲ | tombert 2 days ago | parent | prev | next [-] | | This administration has made it very clear that it will do what it can to change laws whenever convenient, without congressional oversight, whether or not it is "allowed" to. Trump immediately implemented tariffs he wasn't allowed to, he started a war he probably wasn't allowed to in order to (allegedly) distract from associating with a pedophile, he wrote an executive order trying to undo the Fourteenth Amendment, and he has actively been abducting and imprisoning lawful residents (and even citizens!) while pushing for racial profiling to do so. If a company feels like the government will simply rewrite the laws to advance any kind of political whim (including to be weaponized against that very company!), it's not wrong or even weird for it to want to add safeguards to its product. To be clear, this isn't weird or uncommon: lots of the stuff you sign in a EULA restricts things that aren't "illegal". | |
| ▲ | a day ago | parent | prev [-] | | [deleted] |
|
|
| ▲ | WarmWash 2 days ago | parent | prev [-] |
| Anthropic wanted the ability to verify compliance, whereas OAI and Google are fine with "trust us". Which is how it always is, and always has been. For better or worse, the government is the one who audits, and it has its own internal systems for self-audits, so no one except the government tells it what it can or cannot do. The government would never put itself in a position where civilians died because Amodei didn't like the vibe of the case being worked. In a way, it's wild that people are upset that the government didn't put a billionaire megacorp CEO in the driver's seat of intelligence. |
| |
| ▲ | ffsm8 2 days ago | parent | next [-] | | It's incredible if you honestly believe that. The only reason this blew up at all was the insane overreach by the DoW after Anthropic voiced its concern. It was well within Anthropic's rights to do so, as it was part of their contract. It would've been very understandable for the DoW to balk at that, though the real issue would be the incompetence that let the contract get through with that clause in it. But with that contract in place, the only sensible action would've been to terminate the contract and move on. Frankly, nobody would've cared. But the DoW felt it just had to go further, and its chosen action was an insane overreach; hence the controversy. | |
| ▲ | anticensor 2 days ago | parent | prev | next [-] | | Anthropic wanted the ability to verify compliance whereas OAI and Google went "OK no verification but then we won't give you the weights". | |
| ▲ | trhway 2 days ago | parent | prev [-] | | > So no one except them tells them what they can or cannot do. You're missing the "laundering the responsibility" approach: find a lawyer who writes that the thing is legal in his opinion, and voila. |
|