singron a day ago

You need to train new models to advance the knowledge cutoff. You don't necessarily need to R&D new architectures, and maybe you can infuse a model with new knowledge without fully retraining from scratch, but if you do nothing, the model will become obsolete.

Also, the SemiAnalysis estimate is from Feb 2023, which is before the release of GPT-4, and it assumes 13 million DAU. ChatGPT now has 800 million WAU, so that's somewhere between ~115 million and 800 million DAU. For example, if we prorate the COGS estimate for 200 million DAU, that's roughly 15x higher, or $3.75B.
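The proration is simple back-of-envelope arithmetic; a sketch (the ~$250M/yr baseline COGS is back-derived here from the 15x and $3.75B figures in the comment, not quoted from the SemiAnalysis report):

```python
# Prorating the Feb 2023 inference-cost estimate to a larger user base.
# baseline_cogs (~$250M/yr) is an assumption implied by the 15x / $3.75B
# figures above, not a number taken from the report itself.
baseline_dau = 13e6    # DAU assumed by the Feb 2023 estimate
baseline_cogs = 250e6  # assumed annual COGS baseline (see note above)

assumed_dau = 200e6    # a mid-range guess between ~115M and 800M DAU
scale = assumed_dau / baseline_dau           # ~15.4, rounded to ~15x
prorated_cogs = baseline_cogs * round(scale) # 15 * $250M = $3.75B

print(round(scale), prorated_cogs)  # 15 3750000000.0
```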

ghc a day ago | parent

> You need to train new models to advance the knowledge cutoff

That's a great point, but I think it's less important now with MCP and RAG. If VC money dried up and the bubble burst, we'd still have broadly useful models that wouldn't be obsolete for years. Releasing a new model every year might be a lot cheaper if a company converts GPU opex to capex and accepts a long training time.
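The RAG point can be sketched minimally: a frozen model can answer past its knowledge cutoff if retrieval injects fresh documents into the prompt at query time. The `llm` callable and the keyword scorer below are hypothetical stand-ins, not any real API:

```python
def keyword_overlap(query: str, doc: str) -> int:
    """Naive relevance score: count of shared lowercase words."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def answer_with_rag(question: str, corpus: list[str], llm, top_k: int = 2) -> str:
    """Prepend the top-k retrieved documents to the prompt, then call the model."""
    ranked = sorted(corpus, key=lambda d: keyword_overlap(question, d), reverse=True)
    context = "\n\n".join(ranked[:top_k])
    return llm(f"Context:\n{context}\n\nQuestion: {question}")

# Usage with obviously fake documents and a stub "model" that echoes its prompt:
docs = [
    "the 2031 budget was approved in March",  # post-cutoff fact (invented)
    "a bread recipe from 1950",
]
prompt = answer_with_rag("When did the budget pass?", docs, llm=lambda p: p, top_k=1)
```

Real systems use embedding-based retrieval rather than keyword overlap, but the shape is the same: the model's weights stay fixed while the context carries the new knowledge.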

> Also the semianalysis estimate is from Feb 2023,

Oh! I missed the date. You're right, that's a lot more expensive. On the other hand, inference has likely gotten a lot cheaper (in terms of GPU TOPS) too. Still, I think there's a profitable business model there if VC funding dries up and most of the model companies collapse.