Show HN: OpenBridge – turn web chat access into an OpenAI-compatible endpoint
3 points by linuz 8 hours ago
I built OpenBridge: a local bridge that lets agents and tools talk to models through the web chats you already have access to, using a standard OpenAI-style API. The idea is simple: if you can use a model from a browser chat, OpenBridge exposes that access as a local endpoint, so tools like OpenCode, OpenClaw, PI, or anything else that speaks the OpenAI format can use it too.

How it works:

* runs locally
* uses your existing authenticated web session
* translates requests and responses into a standard OpenAI-compatible interface
* supports normal chat flows and tool-style interactions

I've already used it to build several complex apps, including through OpenClaw, and it's been great.

---

The motivation is bigger than just saving tokens. I think inference should be as accessible as possible. If a person already has access to a model, they should be able to use that access from their own tools, on their own machine, without being forced into a separate paid API path just to automate legitimate personal workflows.

If your main concern is protecting the ToS of billion-dollar AI companies that ingested the open web at massive scale and now charge users for access to models trained on it, then this project is not for you.

https://github.com/uncensoredcode/openbridge

And yeah, this post was also written using OpenBridge.

Cheers,
Linuz
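P.S. — if you're wondering what "speaks the OpenAI format" looks like from the client side, here's a rough stdlib-only sketch. The port, path, and model name are my assumptions for illustration, not documented OpenBridge defaults — check the repo for the real values:

```python
# Sketch: build a standard OpenAI-style /chat/completions request aimed at a
# local bridge. Port 8000 and the model name are assumptions, not OpenBridge
# defaults; see the repo README for actual configuration.
import json
import urllib.request

BASE_URL = "http://localhost:8000/v1"  # assumed address of the local bridge

def chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Return an OpenAI-compatible chat completion request (not yet sent)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request("gpt-4o", "Say hi")
# With the bridge running, you'd send it via urllib.request.urlopen(req)
# and read choices[0].message.content from the JSON response.
print(req.full_url)
```

The point is that any existing OpenAI SDK or tool works the same way: you just point its base URL at the bridge instead of api.openai.com.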