thimabi 3 days ago
Oh, well, ChatGPT is being left in the dust… When done correctly, having a one-million-token context window is amazing for all sorts of tasks: understanding large codebases, summarizing books, finding information across many documents, etc. Existing RAG solutions fill the void up to a point, but they lack the precision that large context windows offer. I’m excited for this release and hope to see it soon in the UI as well.
OutOfHere 3 days ago | parent
Fwiw, OpenAI does have a decent active API model family in GPT-4.1, with a 1M context. But yes, the context of the GPT-5 models is terrible in comparison, and it's altogether atrocious for the GPT-5-Chat model. The biggest issue in ChatGPT right now is a very inconsistent experience, presumably because smaller models get used even for paid users with complex questions.