energy123 4 days ago

A public warning about OpenAI's Plus chat subscription as of today.

They advertise a 196k-token context length[1], but you can't submit more than ~50k tokens in one prompt. If you do, the prompt goes through, but they chop off the right-hand side of your prompt (something like tokens[:50000]) before calling the model.
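
To check where that cutoff would land in your own prompt, here's a rough sketch using the tiktoken library. The o200k_base encoding and the 50k figure are my guesses based on the behaviour described above, not anything OpenAI documents:

    # Check where a ~50k-token cutoff would land in a prompt.
    import tiktoken

    enc = tiktoken.get_encoding("o200k_base")   # assumed encoding for GPT-5-family models
    prompt = open("my_prompt.txt").read()       # hypothetical input file
    tokens = enc.encode(prompt)

    print(f"prompt is {len(tokens)} tokens")
    if len(tokens) > 50_000:
        # If the tokens[:50000] behaviour above is accurate, everything
        # past this point never reaches the model.
        kept = enc.decode(tokens[:50_000])
        print("last text the model would see:", repr(kept[-80:]))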

This is the same "bug" that existed 4 months ago with GPT-5.0, which they "fixed" only after some high-profile Twitter influencers made noise about it. I hadn't been a subscriber for a while, but I re-subscribed recently and discovered that the "bug" is back.

Anyone with a Plus sub can replicate this by generating > 50k tokens of noise and then asking it "what is 1+1?". It won't answer.
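
Rough repro sketch (the word list and file name are arbitrary; the point is just to get well past 50k tokens with the actual question at the very end, where the truncation would drop it):

    # Build a > 50k-token noise prompt ending in "what is 1+1?",
    # then paste the file contents into a Plus chat.
    import random
    import tiktoken

    enc = tiktoken.get_encoding("o200k_base")   # assumed encoding
    words = ["alpha", "bravo", "charlie", "delta", "echo", "foxtrot"]
    noise = " ".join(random.choice(words) for _ in range(80_000))
    prompt = noise + "\n\nwhat is 1+1?"

    print(f"{len(enc.encode(prompt))} tokens")  # should comfortably exceed 50k
    with open("repro_prompt.txt", "w") as f:
        f.write(prompt)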

[1] https://help.openai.com/en/articles/11909943-gpt-52-in-chatg...

hu3 3 days ago | parent | next [-]

Well, this explains the weird behaviour of GPT-5 often ignoring a large part of my prompt when I attached many code/CSV files, despite keeping the total token count under control. That is with GitHub Copilot inside VS Code.

The fix was to just switch to Claude 3.5 and now to 4.5 in VSCode.

wrcwill 3 days ago | parent | prev | next [-]

Ugh, this is so amateurish. I swear this has been happening on and off since the release of o3.

scrollop 4 days ago | parent | prev | next [-]

And the xhigh version is only available via the API, not ChatGPT.
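
For what it's worth, via the API it would be requested roughly like this. This is only a sketch: the model name is a placeholder, and whether "xhigh" is actually accepted as a reasoning_effort value for it is an assumption on my part:

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model="gpt-5.1",           # placeholder model name
        reasoning_effort="xhigh",  # value from the comment above; availability is an assumption
        messages=[{"role": "user", "content": "what is 1+1?"}],
    )
    print(resp.choices[0].message.content)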

noname120 4 days ago | parent [-]

Are you sure that the “extended thinking” option in the ChatGPT web client is something different?

4 days ago | parent | next [-]
[deleted]
seunosewa 3 days ago | parent | prev [-]

Probably high.

DANmode a day ago | parent | prev | next [-]

Is this via the API, or only the web UI?

ismailmaj 3 days ago | parent | prev [-]

"Oh sorry guys, we made the mistake again that saves us X% in compute cost, we will fix it soon!"