rrrrrrrrrrrryan 4 days ago
OpenAI has already started degrading their $20/month tier by automatically routing most requests to the lightest free-tier models. We're very clearly heading toward a future with a heavily ad-supported free tier, a cheaper (~$20/month) consumer tier with no ads or very few ads, and a business tier ($200-$1000/month) that can actually access state-of-the-art models. Like Spotify, the free tier will operate at a loss and act as a marketing funnel to the consumer tier, the consumer tier will operate at a narrow profit, and the business tier for the best models will have wide profit margins.
lodovic 4 days ago | parent | next
I find that hard to believe. As long as we have open-weight models, people will have an alternative to these subscriptions. At $200 a month, it's cheaper to buy a GPU with lots of memory or to rent a private H200. No ads and no spying. At this point the subscriptions are mainly about the agent functionality, not so much the knowledge in the models themselves.
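A quick back-of-envelope check on the rental claim. The hourly rate below is an illustrative assumption (cloud H200 pricing varies widely by provider), and `breakeven_hours` is just a helper for this sketch, not a real tool:

```python
# Rough break-even sketch: $200/month subscription vs. renting an H200 by the hour.
# The hourly rate is a hypothetical figure for illustration, not a quoted price.
SUBSCRIPTION_USD_PER_MONTH = 200.0
ASSUMED_H200_USD_PER_HOUR = 3.50  # assumption: plausible on-demand rate

def breakeven_hours(monthly_sub: float, hourly_rent: float) -> float:
    """Hours of rented GPU time per month that cost the same as the subscription."""
    return monthly_sub / hourly_rent

hours = breakeven_hours(SUBSCRIPTION_USD_PER_MONTH, ASSUMED_H200_USD_PER_HOUR)
print(f"Break-even: {hours:.0f} GPU-hours/month")  # ~57 hours at the assumed rate
```

So at that assumed rate, renting only beats the subscription for light usage (under roughly two hours a day); heavy or continuous use would cost far more than $200/month.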
willcannings 4 days ago | parent | prev
Most? Almost all my requests to the "Auto" model end up being routed to a "thinking" model, even those I'd expect ChatGPT to answer fine without extra reasoning time. Never say never, but right now the router doesn't seem to be optimising for cost (at least for me); it really does seem to be selecting a model based on the question itself.