heftykoo 12 hours ago

Ah, the classic AI startup lifecycle:

We must build a moat to save humanity from AI.

Please regulate our open-source competitors for safety.

Actually, safety doesn't scale well for our Q3 revenue targets.

baq 9 hours ago | parent | next [-]

Foundational model provider manifesto:

‘While there’s value in safety, we value the Pentagon’s dollars more’

pera 6 hours ago | parent [-]

It turns out the biggest threat to AI safety is capitalism, who would have thought

samplatt 6 hours ago | parent | next [-]

Certainly not the prior century-and-a-half's worth of books and films.

Thanemate an hour ago | parent [-]

And I still run into naysayers claiming that we cannot extract valuable opinions or warnings from fiction because "they're fictional". Fiction comes from ideas. Fiction is not meant to model reality but to approximate it, in order to make a point either explicitly or implicitly.

Just because they're not 1:1 models of reality or predictions doesn't mean the ideas they communicate are worthless.

peyton 6 hours ago | parent | prev | next [-]

I don’t get it. Even the Soviet Union used money. Simply paying for stuff isn’t necessarily capitalism? Or are you suggesting Anthropic should be state-owned?

jon-wood 5 hours ago | parent | next [-]

No, capitalism is prioritising profit over everything else, as we see happening here.

wongarsu 4 hours ago | parent | prev [-]

Using money as a medium to facilitate the exchange of goods and services is not capitalism. Abandoning one of your core principles in the pursuit of money (or, more charitably, because not doing so means your competitors will make more money and overtake you in the marketplace) is an outgrowth of capitalism.

In the Soviet Union the reasons might have been "to beat the capitalists", "for the pride of our country", or "Stalin asked us to, and saying no means we get sent to Siberia". Though a variant of the last one may well have happened here, and the justification we read is just the one least damaging to everyone involved.

gibsonsmog 3 hours ago | parent [-]

>Though a variant of the last one may well have happened here, and the justification we read is just the one less damaging to everyone involved

Hegseth was planning on getting the model via the Defense Production Act, or on killing Anthropic via a supply-chain-risk classification that would prevent any company working with the Pentagon from also working with Anthropic. So while it wasn't Siberia, it was about as close as the US can get without declaring Claude a terrorist. Which I'm sure is on the table regardless.

addandsubtract an hour ago | parent [-]

And you know Claude will be on the hook for any bad "decision" the military makes. So this will end poorly for them, anyway.

hiAndrewQuinn 6 hours ago | parent | prev [-]

Nick Land has basically been saying this since the 90s, if you can look past all the rhetoric

gom_jabbar an hour ago | parent [-]

Exactly. He recently said the following in an interview:

"AI safety and anti-capitalism [...] are at least strongly analogous, if not exactly the same thing." [0]

[0] Nick Land (2026). A Conversation with Nick Land (Part 2) by Vincent Lê in Architechtonics Substack. Retrieved from vincentl3.substack.com/p/a-conversation-with-nick-land-part-a4f

dmix 10 hours ago | parent | prev | next [-]

Once they are a dominant market leader they will go back to asking the government to regulate based on policy suggestions from non-profits they also fund.

amelius 3 hours ago | parent | next [-]

As if their shareholders would agree.

nielsbot 10 hours ago | parent | prev [-]

Is this sarcasm?

Frieren 6 hours ago | parent | next [-]

It is well known that big corporations take good regulations and change them so that they:

1. Are easier to bypass for themselves.

2. Create extra work for competitors.

3. Convince the public that the problems are solved, so no further action is needed.

In many industries, government and corporations work together to create regulations, bypassing the social movements that asked for the industry to be regulated and the actual problems they raised. The end result is regulations that are extremely complex, full of exceptions for anything big corporations paid to change, instead of regulations that protect citizens and encourage competition.

deltoidmaximus 2 hours ago | parent [-]

See the Mattel lead-painted-toy scandal. Congress responded by passing regulations requiring manufacturers to have their toys tested for lead, then exempted large companies like Mattel because they were deemed large enough to handle it on their own, even though they were the reason for the legislation precisely because they weren't handling it on their own. Mattel sells lead-painted toys and Congress responds by hobbling its competitors.

bee_rider 10 hours ago | parent | prev | next [-]

I think it is cynicism; at least, there’s an idea that once a company is dominant it should want regulation, as it’ll stifle competition (since the competition has less capacity for regulatory hoop-jumping, or the competition will have had less time to do regulatory capture).

wiml 9 hours ago | parent | prev | next [-]

I wouldn't think so. Regulatory capture is a pretty typical activity for a dominant company.

Gud 8 hours ago | parent [-]

Why is this downvoted? It happens all the time; large corporations always try to block competitors using regulatory capture.

lukan 6 hours ago | parent [-]

People not liking the concept, but shooting the messenger? (But seems not downvoted anymore.)

baq 9 hours ago | parent | prev [-]

sama did just that a couple years ago

yesimahuman 42 minutes ago | parent | prev | next [-]

The only surprise is how quickly it all happened!

jwr 6 hours ago | parent | prev | next [-]

It's not just AI: replace "safe" with "open" and you will find a close match with many companies. I guess the difference is that after the initial phase, we are continuously being gaslit by companies calling things "open" when they are most definitely not.

varispeed 6 hours ago | parent | prev [-]

Politicians also love to regulate, especially over wine and steak and when the watchers don't watch.