dist-epoch 8 hours ago
There are really nice GUIs for LLMs (CherryStudio, for example, works with local or cloud models). There are also web UIs, just like the labs' own. And you can connect coding agents like Codex, Copilot, or Pi to local models, since they support OpenAI-compatible APIs. It's literally one terminal command to start serving a model locally, and then you can point various things at it, like Codex.
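A minimal sketch of that workflow, using llama.cpp's `llama-server` as one example of a local OpenAI-compatible server (the model file, port, and message are illustrative; ollama or vLLM work similarly, and each client has its own way of setting the API base URL):

```shell
# Serve a local model with an OpenAI-compatible API on port 8080.
# (Hypothetical model file; substitute whatever GGUF you have downloaded.)
llama-server -m ./my-local-model.gguf --port 8080

# In another terminal: any OpenAI-compatible client can now talk to it.
# For example, a raw request against the standard chat completions endpoint:
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "hello"}]}'
```

Coding agents connect the same way: configure them to use `http://localhost:8080/v1` as the API base URL instead of the cloud endpoint.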