gs17 5 days ago
Yeah, I tried it in Copilot and it's fast, but I'd rather have a 2x smarter model that takes 10x longer. The competition for "fast" is the existing autocomplete model, not the chat models.
dmix 5 days ago | parent
Why wouldn't you want the option for both? I haven't used Copilot in a while, but Cursor lets you easily switch the model depending on what you're trying to do. Having options for thinking, normal, and fast covers every sort of problem. GPT-5 doesn't let you choose, which IMO is only helpful for non-IDE integrations, and even in ChatGPT it can be annoying to get "thinking" constantly for simple questions.