▲ | ghc 10 hours ago
Obviously you don't need to train new models to operate existing ones. I think I trust the semianalysis estimate ($250M) more than this estimate ($2B), but who knows? I do see my revenue estimate was for this year, though. Still, $4B of revenue on $250M of COGS is staggeringly good. No wonder Amazon, Google, and Microsoft are tripping over themselves to offer these models for a fee.
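A quick back-of-envelope check of that margin (an illustrative Python sketch; the $4B and $250M figures are just the estimates quoted above, not audited numbers):

    # Rough gross-margin check on the quoted estimates.
    revenue = 4e9   # ~$4B estimated annual revenue
    cogs = 250e6    # ~$250M semianalysis COGS estimate
    margin = (revenue - cogs) / revenue
    print(f"gross margin ~ {margin:.1%}")  # ~93.8%

At that ratio, inference cost barely dents the top line, which is consistent with cloud vendors racing to resell these models for a fee.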
▲ | singron 8 hours ago | parent | next [-]
You need to train new models to advance the knowledge cutoff. You don't necessarily need to R&D new architectures, and maybe you can infuse a model with new knowledge without retraining from scratch, but if you do nothing the model will become obsolete. Also, the semianalysis estimate is from Feb 2023, which is before the release of GPT-4, and it assumes 13 million DAU. ChatGPT now has 800 million WAU, so that's somewhere between ~115 million and 800 million DAU. E.g., if we prorate the COGS estimate to 200 million DAU, that's roughly 15x higher, or about $3.75B.
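For concreteness, that proration works out like this (a Python sketch; the 200 million DAU figure is just an assumed midpoint within the 115M-800M range above):

    # Prorate the Feb 2023 COGS estimate to a larger user base.
    base_cogs = 250e6    # semianalysis estimate at ~13M DAU
    base_dau = 13e6
    assumed_dau = 200e6  # assumed midpoint; actual DAU is unknown
    scale = assumed_dau / base_dau             # ~15.4x
    print(f"~{scale:.1f}x -> ${base_cogs * scale / 1e9:.2f}B")
    # ~15.4x -> ~$3.85B; rounding the scale to 15x gives the ~$3.75B above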
| ||||||||
▲ | hamburga 10 hours ago | parent | prev | next [-]
But assuming no new models are trained, this competitive effect drives down the profit margin on the current SOTA models to zero. | ||||||||
| ||||||||
▲ | dvfjsdhgfv 4 hours ago | parent | prev [-]
> Obviously you don't need to train new models to operate existing ones.

For a few months, maybe. Then they become obsolete and, in some cases like coding, useless.