DesaiAshu 5 hours ago
It's not "just" a $200M contract, it's the start of a lucrative relationship.

1. Stargate seemed to require a dedicated press conference by the President to achieve funding targets. Why risk that level of politicization if it didn't?

2. Greg Brockman donated $25M to a Trump MAGA Super PAC last year. Why risk so much political backlash for a low-leverage return of $200M on $25M spent?

3. During WW2, military spend shot from 2% to 40% of GDP. The administration is requesting a $1.5T military budget for FY2027, up from $0.8T for FY2025. They have made clear in the past two months that they plan to use it and are not stopping anytime soon.

If you believe "software eats the world," it is reasonable to expect the share of total military spend captured by software companies to increase dramatically over the next decade. $100B (10% capture) is a reasonable possibility for the domestic military AI TAM in FY2027 if the spending increase is approved (so far, Republicans have not broken rank with the administration on any meaningful policy).

If US military actions continue to accelerate, other countries will also ratchet up military spend - largely on nuclear arsenals and AI drones (France has already announced an increase of its arsenal). This further increases the addressable TAM.

Given the competition and lack of moat in the consumer/enterprise markets, I am not sure there is a viable path for OpenAI to cover its losses and fund its infrastructure ambitions without becoming the preferred AI vendor for a rapidly increasing military budget. The devices bet seems to be the most practical alternative, but there is far more competition both domestically (Apple, Google, Motorola) and globally (Xiaomi, Samsung, Huawei) than there is for military AI.

Having run an unprofitable P&L for a decade, I can confidently state that a healthy balance sheet is the only way to maintain and defend one's core values and principles.
As the "alignment" folks in the AI industry are likely to learn - the road to hell (aka a heavily militarized world) is oft paved with the best intentions.
solenoid0937 5 hours ago
First, I have to say I loved your thoughtful & detailed comment. You have clearly considered this from the financial side; let me add some color from the perspective of someone working with frontier researchers.

> As the "alignment" folks in the AI industry are likely to learn

I will push back here. Dario & co are not the starry-eyed naive idealists implied. This is a calculated decision to maximize their goal (safe AGI/ASI). You have the right philosophy on the balance-sheet side of things, but what you're missing is that researchers are more valuable than any military spend or any datacenter. It does not matter how many hundreds of billions you have - if the 500-1000 top researchers don't want to work for you, you're fucked; and if they do, you will win, because these are the people who come up with the step-change improvements in capability.

There is no substitute for sheer IQ:

- You can't buy it (god knows Zuck has tried, and failed to earn their respect).
- You can't build it (yet).
- And collaboration amongst less intelligent people does not reliably achieve the requisite "Eureka" realizations.

Had Anthropic gone forth with the DoD contract, they would have lost this top crowd, crippling the firm. On the other hand, by rejecting the contract, Anthropic's recruiting just got much easier (and OAI's much harder).

Generally, the defense crowd have a somewhat inflated sense of self-worth. Yes, there's a lot of money, but very few highly intelligent people want to work for them. (Almost no top talent wants to work for Palantir, despite the pay.) So, naturally:

- If OpenAI becomes a glorified military contractor, they will bleed talent.
- Top talent's low trust in the government means Manhattan Project-style collaborations are dead in the water.

As such, AGI will likely emerge from a private enterprise effort that is not heavily militarized.

Finally, the Anthropic restrictions will last, what, 2.5 more years? They are being locked out of a narrow subset of use cases (DoD contract work only - vendors can still use it for all other work; Hegseth's reading of the SCR is incorrect) and have farmed massive reputation gains for both top talent and the next administration.