ojosilva 6 days ago
But, by the looks of things, models will be more efficient by then, and a cheaper-to-run model will produce comparable output. At least that's how it's been with OSS models, and with the OpenAI API models. So the inevitable price hike (or rate limiting) may just lead to switching models/providers, with results that are just as good.
racc1337 5 days ago
There is an interesting Substack post about this: LLM costs are dropping ~10x/year, but the number of tokens used has gone up like crazy. https://open.substack.com/pub/ethanding/p/ai-subscriptions-g...
krainboltgreene 6 days ago
> But, by the looks of things, models will be more efficient by then and a cheaper-to-run model will produce comparable output

So far the evidence points the other way: things are getting more expensive for similar outputs.