▲ | jascha_eng 3 months ago |
Hmm, I like the idea of providing a unified interface for all LLMs to interact with outside data. But I don't really understand why this is local-only. It would be a lot more interesting if I could connect this to my GitHub in the web app and Claude automatically has access to my code repositories. I guess I can do this for my local file system now? I also wonder: if I build an LLM-powered app and currently simply do RAG, injecting the retrieved data into my prompts, should this replace that? Can I even integrate this in a useful way? The use case of "on your machine, with your specific data" seems very narrow to me right now, considering how many different context sources and use cases there are.
▲ | jspahrsummers 3 months ago | parent | next [-]
We're definitely interested in extending MCP to cover remote connections as well. Both SDKs already support an SSE transport with that in mind: https://modelcontextprotocol.io/docs/concepts/transports#ser... However, it's not quite a complete story yet. Remote connections introduce a lot more questions and complexity related to deployment, auth, security, etc. We'll be working through these in the coming weeks, and would love any and all input!
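For context on the SSE transport mentioned above: Server-Sent Events frames are blocks of "field: value" lines separated by blank lines, and the MCP SSE transport delivers JSON-RPC messages over such a stream. A minimal sketch of parsing that framing in Python (the payload and event name below are illustrative assumptions, not the exact MCP wire format):

```python
import json

def parse_sse(stream: str):
    """Parse a Server-Sent Events stream into (event, data) pairs.

    Frames are separated by blank lines; "data:" lines within a frame
    are joined with newlines, and "event:" sets the frame's event name
    (defaulting to "message").
    """
    events = []
    event, data_lines = "message", []
    for line in stream.splitlines():
        if line == "":  # blank line terminates the current frame
            if data_lines:
                events.append((event, "\n".join(data_lines)))
            event, data_lines = "message", []
        elif line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
    return events

# A hypothetical stream carrying one JSON-RPC request as an SSE event.
raw = (
    "event: message\n"
    'data: {"jsonrpc": "2.0", "id": 1, "method": "resources/list"}\n'
    "\n"
)

for event, data in parse_sse(raw):
    msg = json.loads(data)
    print(event, msg["method"])  # → message resources/list
```

The point is just that the transport is plain HTTP plus a simple text framing, which is part of why it generalizes naturally from local to remote connections.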
▲ | mike_hearn 3 months ago | parent | prev | next [-]
Local-only solves a lot of problems. Our infrastructure does tend to assume that data and credentials are on a local computer. OAuth is horribly complex to set up, and there's no real benefit to messing with that when local works fine.
▲ | TeMPOraL 3 months ago | parent | prev | next [-]
I'm honestly happy with them starting local-first, because... imagine what it would look like if they did the opposite.

> It would be a lot more interesting if I could connect this to my github in the web app and claude automatically has access to my code repositories.

In which case the "API" would be governed by a contract between Anthropic and GitHub, to which you're a third party (read: sharecropper). Interoperability on the web has already been mostly killed by the practice of companies integrating with each other via back-channel deals: you are either a commercial partner, or you're out of the playground and get no toys. Starting locally means they're at least reversing this trend a bit by setting a different default: LLMs are fine to integrate with arbitrary code the user runs on their machine. No need to sign an extra contract with anyone!
▲ | bryant 3 months ago | parent | prev | next [-]
> It would be a lot more interesting if I could connect this to my github in the web app and claude automatically has access to my code repositories.

From the link:

> To help developers start exploring, we’re sharing pre-built MCP servers for popular enterprise systems like Google Drive, Slack, GitHub, Git, Postgres, and Puppeteer.
▲ | singularity2001 3 months ago | parent | prev [-]
For me it's complementary to OpenAI's custom GPTs, which are non-local.