littlestymaar | an hour ago
I'd be a little more nuanced: I think there's something off with their plans right now. It's pretty clear at this point that they can't own the technological frontier: Google is already too close, and from a purely technological PoV Google is much better positioned to have the best tech in the medium term. (There's no moat, and Google has far more data and compute available, plus tons of cash to burn without depending on external funding.)

But ChatGPT is an insanely strong brand, and for most (free) customers I don't think model capabilities (aka "intelligence") matter that much. So if they stopped training frontier models right now and focused on driving costs down by optimizing their inference compute budget while serving ads, they could make a lot of money from their user base.

But that would probably mean losing most of their paying customers over the long run (companies won't buy mediocre tokens at a premium for long), and more importantly it would require abandoning the AGI bullshit narrative, which I'm not sure Altman is willing to do. (And even if he were, how to do that without collapsing from lack of liquidity once investors feel betrayed is an open question.)
bloppe | 4 minutes ago | parent | next
The best way to drive inference costs down right now is to use TPUs. Either that, or invest tons of additional money and manpower into silicon design like Google did, but Google already has a 10-year lead there.
riffraff | 40 minutes ago | parent | prev
> But ChatGPT is an insanely strong brand

I mean, so was Netscape.