dragonwriter | 5 days ago:
> I wonder how much of the '5 release was about cutting costs vs making it outwardly better. I'm speculating that one reason they'd deprecate older models is because 5 is materially cheaper to run?

I mean, assuming the API pricing has some relation to OpenAI's cost to provide (which is somewhat speculative, sure), that seems pretty well supported as a truth, if not necessarily the reason for the model being introduced: the models discontinued (“deprecated” implies entering a notice period for future discontinuation) from the ChatGPT interface are priced significantly higher than GPT-5 on the API.

> For companies that extensively test the apps they're building (which should be everyone) swapping out a model is a lot of work.

Who is building apps relying on the ChatGPT frontend as a model provider? Apps would normally depend on the OpenAI API, where the models are still available, but GPT-5 is added and cheaper.
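(A minimal sketch of what "depend on the OpenAI API" looks like in practice, assuming the official `openai` Python SDK; the model name and the `summarize` helper are illustrative, not anything from the thread. The point is that an app pins a specific model string, so swapping models is an explicit code/config change followed by retesting, not something that changes underneath you when the ChatGPT frontend retires a model.)

    # Minimal sketch, assuming the official `openai` Python SDK (v1.x).
    # The model name is pinned in one place, so "swapping out a model"
    # is a one-line change followed by re-running the app's test suite.
    from openai import OpenAI

    MODEL = "gpt-4o"  # illustrative pinned model; e.g. change to "gpt-5" only after retesting

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def summarize(text: str) -> str:
        response = client.chat.completions.create(
            model=MODEL,
            messages=[
                {"role": "system", "content": "Summarize the user's text in one sentence."},
                {"role": "user", "content": text},
            ],
        )
        return response.choices[0].message.content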
nickthegreek | 5 days ago:
> Who is building apps relying on the ChatGPT frontend as a model provider? Apps would normally depend on the OpenAI API, where the models are still available, but GPT-5 is added and cheaper.

Always enjoy your comments dw, but on this one I disagree. Many non-technical people at my org use custom GPTs as "apps" for recurring tasks. Some of them have spent an absurd amount of time tweaking instructions and knowledge over and over. Also, when you create a custom GPT, you can specifically set the preferred model, so this will no doubt change the behavior of those GPTs. Ideally, at the enterprise level, our admins would have a longer sunset on these models via the web/app interface to ensure no hiccups.
trashface | 5 days ago:
Maybe the true cost of GPT-5 is hidden. I tried to use the GPT-5 API and OpenAI wanted me to do a biometric scan with my camera. Yikes.