creddit 2 hours ago

Gemini 3 is crushing my personal evals for research purposes.

I would cancel my ChatGPT sub immediately if Gemini had a desktop app, and I may still do so if it continues to impress me as much as it has so far; I'll just live without the desktop app.

It's really, really, really good so far. Wow.

Note that I haven't tried it for coding yet!

ethmarks 2 hours ago | parent [-]

Genuinely curious here: why is the desktop app so important?

I completely understand the appeal of having local and offline applications, but the ChatGPT desktop app doesn't work without an internet connection anyways. Is it just the convenience? Why is a dedicated desktop app so much better than just opening a browser tab or even using a PWA?

Also, have you looked into open-webui or Msty or other provider-agnostic LLM desktop apps? I personally use Msty with Gemini 2.5 Pro for complex tasks and Cerebras GLM 4.6 for fast tasks.

creddit an hour ago | parent [-]

I have a few reasons for the preference:

(1) The ability to add context via a local app's integration with OS-level resources is big. With Claude, e.g., I hit Option-SPC, which brings up a prompt bar. From there, taking a screenshot that gets sent with my prompt is as simple as dragging a bounding box. This is great. Beyond that, I can add my own MCP connectors and give my desktop app direct access to relevant context in a way that doesn't work via the web UI. It can also be inconvenient to give context to a web UI in some cases, e.g., when I have a folder of PDFs I want it to be able to reference.
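For anyone curious what the MCP connector setup looks like: with the Claude desktop app, you register local MCP servers in its `claude_desktop_config.json`. A minimal sketch for the PDF-folder case, assuming the official filesystem server package and a hypothetical path and server name:

```json
{
  "mcpServers": {
    "pdf-library": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/me/Documents/pdfs"
      ]
    }
  }
}
```

After restarting the app, the model can list and read files under that folder directly, which is exactly the kind of context that's awkward to hand to a web UI.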

(2) Its own icon that I can CMD-TAB to is so much nicer. Maybe that works with a PWA? Not really sure.

(3) Even if I can't use an LLM when offline, having access to my chats for context has been repeatedly valuable to me.

I haven't looked at provider-agnostic apps and, TBH, would be wary of them.

ethmarks 44 minutes ago | parent [-]

> The ability to add context via a local apps integration into OS level resources is big

Good point. I can see why integrated support for local filesystem tools would be useful, even though I prefer manually uploading specific files to avoid polluting the context with irrelevant info.

> Its own icon that I can CMD-TAB to is so much nicer

Fair enough. I personally prefer Firefox's tab organization to my OS's window organization, but I can see how separating the LLM into its own window would be helpful.

> having access to my chats for context has been repeatedly valuable to me.

I didn't at all consider this. Point ceded.

> I haven't looked at provider-agnostic apps and, TBH, would be wary of them.

Interesting. Why? Is it security? The ones I listed are open source and auditable; I'm confident they won't steal my API keys. Msty has a lot of advanced functionality I haven't seen in other interfaces, like comparing responses between different LLMs, exporting the entire conversation to Markdown, and editing the LLM's response to manage context. It also sidesteps the "[provider] doesn't have a desktop app" problem entirely, because you can use any provider's API.