RockyMcNuts 5 days ago

OpenAI should probably consider:

- enabling local MCP in Desktop like Claude Desktop does, not just server-side remote connectors. (I don't think you can use a locally running server unless you expose it to their IPs)

- having an MCP store where you can click on e.g. Figma to connect your account and start talking to it

- letting you easily connect to your own Agents SDK MCP servers deployed in their cloud

ChatGPT MCP support is underwhelming compared to Claude Desktop.

robbomacrae 4 days ago

You absolutely can run a local MCP server! I use one as part of TalkiTo: it starts the server in the background and connects it to Claude Code at runtime, so it shows up like this:

talkito: http://127.0.0.1:8000/sse (SSE)

https://github.com/robdmac/talkito/blob/main/talkito/mcp.py

Admittedly, that's not as straightforward as one might hope.
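For anyone curious what the `http://127.0.0.1:8000/sse (SSE)` line above actually involves: a minimal, stdlib-only sketch of the Server-Sent Events plumbing such a server sits on. This is not talkito's implementation and not the MCP protocol itself (a real MCP server streams JSON-RPC messages over this channel); the handler, port, and `endpoint` event are illustrative assumptions.

```python
# Stdlib-only sketch of SSE transport: serve a stream on 127.0.0.1
# and read one event back. A real MCP server would exchange JSON-RPC
# messages over a channel like this; here we just show the plumbing.
import http.server
import threading
import urllib.request

class SSEHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/sse":
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/event-stream")
        self.end_headers()
        # Hypothetical first event; MCP servers send their own messages.
        self.wfile.write(b"event: endpoint\ndata: /messages\n\n")

    def log_message(self, *args):  # keep the demo quiet
        pass

def first_sse_event(url):
    """Collect lines until the blank line that ends one SSE event."""
    lines = []
    with urllib.request.urlopen(url) as resp:
        for raw in resp:
            line = raw.decode().rstrip("\n")
            if line == "":
                break
            lines.append(line)
    return lines

# Bind an ephemeral local port and serve in a background thread.
server = http.server.HTTPServer(("127.0.0.1", 0), SSEHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
event = first_sse_event(f"http://127.0.0.1:{port}/sse")
server.shutdown()
print(event)  # ['event: endpoint', 'data: /messages']
```

The point of the local-vs-remote distinction upthread is exactly this: the server binds to 127.0.0.1, so only clients on the same machine (like Claude Code) can reach it unless you tunnel it out.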

Also, regarding the point about "letting you easily connect to your own Agents SDK MCP servers deployed in their cloud": I hear roocode has a cool new remote-connect feature for your local machine, so you can interact with roocode on your desktop from any browser.

namibj 4 days ago

`tailscale serve` is easy. Set appropriate permissions/credentials so ChatGPT can authenticate to the MCP server.
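A rough sketch of what that looks like, assuming the MCP server from upthread is on port 8000. The exact flags have changed across tailscale releases, so treat this as illustrative and check `tailscale serve --help`; note also that `serve` only reaches machines on your own tailnet, so exposing the endpoint to an outside service like ChatGPT would need `funnel` (public) plus the auth the parent comment mentions.

```shell
# Proxy the tailnet HTTPS endpoint to the local MCP server on port 8000
# (syntax varies by tailscale version; verify with `tailscale serve --help`)
tailscale serve --bg 8000

# Check what's currently being served
tailscale serve status

# Only if you need it reachable from outside your tailnet
# (e.g. by a hosted service) -- this makes it publicly accessible,
# so authentication in front of the MCP server is essential:
tailscale funnel 8000
```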

varenc 5 days ago

Agreed on this. I'm still waiting for local MCP server support.

jngiam1 5 days ago

One way around this is to use a gateway to run the local MCP server, then connect to it via the gateway. We just rolled out support for that [1] in the gateway we're building at mintmcp.com

[1] https://www.youtube.com/watch?v=8j9CA5pCr5c