tobyjsullivan a day ago
I had to skim through this a couple of times before I realized that you still need to run an MCP server locally. This is basically a proxy between an LLM and the proposed protocol. It's a nice proof of concept, and it makes sense that the goal would be for LLM clients to adopt and support the standard natively. Then the proxy won't be necessary.
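
For anyone else who skimmed like I did, the proxy boils down to roughly this sketch (not the project's actual code; it assumes the MCP client is on the stdio transport speaking newline-delimited JSON-RPC, and REMOTE_URL is a made-up placeholder for the proposed protocol's WebSocket endpoint):

    // local-proxy.ts — minimal sketch of an MCP stdio <-> WebSocket bridge
    import WebSocket from "ws";
    import { createInterface } from "node:readline";

    // Hypothetical endpoint for the proposed protocol; replace with the real one.
    const REMOTE_URL = process.env.REMOTE_URL ?? "wss://example.com/agent";
    const socket = new WebSocket(REMOTE_URL);

    socket.on("open", () => {
      // Forward each JSON-RPC line from the MCP client (stdin) to the remote side.
      // (Lines arriving before the socket opens are dropped in this sketch.)
      const rl = createInterface({ input: process.stdin });
      rl.on("line", (line) => socket.send(line));
    });

    // Relay remote messages back to the MCP client on stdout.
    socket.on("message", (data) => process.stdout.write(data.toString() + "\n"));

    socket.on("close", () => process.exit(0));
    socket.on("error", (err) => {
      console.error("proxy error:", err);
      process.exit(1);
    });

A client that speaks the protocol natively would just open that WebSocket itself and the middle process disappears.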
koolala a day ago | parent
ChatGPT couldn't connect to it directly over a WebSocket in your browser?