alecco 15 hours ago
But the "Anthropic fight" is mostly fake. Palantir was using Claude as its base model. Anthropic allegedly took issue with unsupervised kills because the technology wasn't ready (or something along those lines). Also, I remember reading this guy has close ties to Anthropic. Also, I find it suspicious how he came to prominence out of nowhere, like Big Tech and the establishment are propping up podcasts as controlled narrative/opposition. I don't buy any of it.
mips_avatar 15 hours ago
He's roommates with an Anthropic researcher. I was roommates with a Google product manager, and I don't think that means I'm bought out by Google.

rustyhancock 15 hours ago
Also, Anthropic has made very clear that they align closely with the DoW. Really, Anthropic doesn't seem to be fighting for anyone but a narrow subset of people. So who cares; none of the big AI providers are particularly ethical. Pick your poison as your conscience and needs allow.
nipponese 13 hours ago
In my view, this guy's podcast didn't get big by talking about AI; it's best known for its Cold War history and foreign policy discussions with Sarah Paine of the U.S. Naval War College.

piyh 14 hours ago
> I find it suspicious how he came to prominence out of nowhere

He was first funded by FTX.

cuuupid 14 hours ago
It's entirely fake. Sure, Palantir uses Claude, but it takes about ten minutes to pull all their federal contracts and realize that what little involvement they have in the kill chain is preliminary.
observationist 14 hours ago
It doesn't matter what you know so much as who you know; networking is the most precious currency. He met the right people, got the right guests, and surfed a wave of fortunate occurrences. He was roommates with Dylan Patel of The Information and Jon Y of Asianometry, and has since developed a wide range of high-level industry contacts.

Sometimes people succeed without earning it, and what matters is what they do with that success afterwards. I'd say Dwarkesh earned it: he got lucky, caught the right waves, and has surfed the hell out of his success. He has consistently offered well-informed, level-headed takes, and has engaged with the field with insight and honest curiosity. When I see people surf like that, I applaud it. There's nothing grifty or shady about him; he's simply had a great series of excellent opportunities and played them for everything they're worth. Once he had a few billionaires on, that was all the social cachet he needed to keep attracting guests, high-level researchers, and other figures in AI.

Readerium 15 hours ago
Yes, he is roommates with an Anthropic researcher.

serguzest 10 hours ago
It seems to me that AI target-selection systems are being used not just for efficiency, but as a way to distance military staff from responsibility for what they are killing. Current AI models naturally speculate and hallucinate if you don't tightly constrain them; we see this all the time as software engineers working with agentic coding. This creates a dangerous dynamic: AI can generate targets that a human operator might not be able to justify manually, and when something goes wrong, the blame can always be shifted to the system, as in the recent incident where roughly 180 children were killed due to faulty targeting.

Israel's way of fighting this war looks more like pure destruction than a conventional military campaign, and AI systems like this are very easy to abuse in that context. At this point it's clear that even the U.S. is willing to eliminate targets when the collateral damage includes the target's family or neighbors. I don't think that would have been acceptable in previous administrations; Israel has lowered the bar. That may be why Anthropic moved early to denounce this kind of usage, even though they had previously partnered with the Department of War.

Now let's look at the statements made by Anthropic and Hegseth:

https://www.anthropic.com/news/where-stand-department-war

https://x.com/SecWar/status/2027507717469049070

From Anthropic's own statement, we learn that they have actually been quite closely partnered. In Hegseth's tweet we see: "Anthropic will continue to provide the Department of War its services for a period of no more than six months to allow for a seamless transition to a better and more patriotic service." This shows that Anthropic's services are still being actively used by the Department of War.

My view is that Anthropic and its investors eventually realized that the American war machine would use their technology in reckless ways, and that this would certainly create a massive PR disaster or, in an ideal world, even legal consequences. That realization likely pushed them to adopt what they now frame as a more "humanitarian" position.