energy123 3 days ago

A warning about ChatGPT Pro: the 128k-token context claim is deceptive advertising.

Messages above ~65k tokens are rejected. Messages between roughly 50k and 65k tokens are accepted, but the right side of the text is pruned before the LLM call is made. Messages just below ~50k are accepted, but are then partly "forgotten" on any follow-up question (either the entire first prompt is excluded, or the left side of the text is chopped off).

Realistically, it's a 55-65k token budget (a ~40k-token question plus a ~15k-token response).
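
If you want to check which of those regimes a prompt falls into before pasting it, here's a minimal sketch using tiktoken. It assumes the o200k_base encoding used by recent OpenAI models; the exact tokenizer behind the ChatGPT Pro UI isn't documented, so treat the counts and the 50k/65k thresholds as rough.

    # Rough token count for a prompt, assuming the o200k_base encoding.
    # Thresholds below are the approximate limits described above, not official numbers.
    import tiktoken

    def count_tokens(text: str, encoding_name: str = "o200k_base") -> int:
        enc = tiktoken.get_encoding(encoding_name)
        return len(enc.encode(text))

    if __name__ == "__main__":
        prompt = open("prompt.txt").read()  # hypothetical input file
        n = count_tokens(prompt)
        print(f"~{n} tokens")
        if n > 65_000:
            print("likely rejected outright")
        elif n > 50_000:
            print("likely accepted but silently truncated")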

They want you to attach your context so they can use RAG.

I can't even be bothered filing a bug report, because I know this shit is intentional. The mistakes always run in a favorable direction.

(That said, GPT-5-Pro is a genuinely good model, and the usage limits are generous.)