rockbruno 3 hours ago

When the AI companies run out of money, I predict tokens will stop being dirt cheap and such setups will become extremely expensive (even for regular software engineering, to some extent). Then it'll become clear how over-engineered most of the things we do with AI are.

skeledrew 2 hours ago | parent | next [-]

> tokens will stop being dirt cheap

That can't be allowed, and also won't happen. If token costs do start going up at a serious rate in the US, you can be sure that they'll stay down in China, and the political situation won't allow for the inevitable exodus to Chinese providers.

patrickk 3 hours ago | parent | prev [-]

In parallel, local models are getting better and better, so eventually they'll get "good enough" to run fairly cheaply, at a level close to the current Sonnet/Opus models (what I run Claudeclaw with), on Groq, OpenRouter, or whatever commodity provider. Perhaps even on mid-to-high-end consumer PCs once the current RAM madness subsides.
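Part of why "whatever commodity provider" works as a strategy is that most hosted providers (OpenRouter, Groq) and common local servers (llama.cpp's server, Ollama) expose the same OpenAI-style `/v1/chat/completions` HTTP API, so switching between them is largely a config change. A minimal sketch; the base URLs and model names here are illustrative assumptions, not specific recommendations:

```python
def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, dict]:
    """Build the URL and JSON payload for an OpenAI-compatible chat call.

    Swapping providers (hosted vs. local) only changes base_url and model;
    the request shape stays the same.
    """
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload

# The same code targets a hosted provider or a local server (example values):
hosted = build_chat_request("https://openrouter.ai/api", "some/hosted-model", "hi")
local = build_chat_request("http://localhost:8080", "local-model", "hi")
```

The payload would then be POSTed with any HTTP client (plus an `Authorization: Bearer <key>` header for hosted providers); the point is that no application logic needs to change when the provider does.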

There's lots of good discussion about local LLMs in this thread:

https://news.ycombinator.com/item?id=47190997
