▲ | Jemaclus 4 days ago
You should install it, because it's exactly what you just described. Edit: From a UI perspective, it's exactly what you described. There's a dropdown where you select the LLM, and a ChatGPT-style chatbox. You just docker-up and go to town. Maybe I don't understand the rest of the request, but I can't imagine software where a web page just magically has LLMs available in the browser with no installation?
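For anyone who wants to try it, the quick-start really is one container; a command along the lines of the one in the Open WebUI README (exact flags vary by setup, and you still need a backend like Ollama for it to talk to):

    docker run -d -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main

Then browse to http://localhost:3000, pick a model from the dropdown, and chat.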
▲ | craftkiller 4 days ago | parent | next [-]
It doesn't seem to be exactly what they're describing. The end-user interface matches, but it sounds like they want the actual LLM to run in the browser (perhaps via WebGPU compute shaders). Open WebUI relies on an external executor like ollama/llama.cpp, which can of course still be self-hosted, but it is not executing INSIDE the browser.
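For contrast, fully in-browser inference does exist: MLC's WebLLM compiles models to WebGPU kernels that run entirely client-side. A minimal TypeScript sketch against its documented API (the model ID is just an illustrative entry from their prebuilt list):

    import { CreateMLCEngine } from "@mlc-ai/web-llm";

    // Fetches the weights into the browser cache and compiles the
    // WebGPU kernels; no external executor like ollama/llama.cpp.
    const engine = await CreateMLCEngine(
      "Llama-3.1-8B-Instruct-q4f32_1-MLC", // illustrative model ID
      { initProgressCallback: (p) => console.log(p.text) },
    );

    // OpenAI-style chat completion, executed on the local GPU.
    const reply = await engine.chat.completions.create({
      messages: [{ role: "user", content: "Hello from the browser!" }],
    });
    console.log(reply.choices[0]?.message.content);

The first load is heavy since the weights are gigabytes, but after that it genuinely is "open a web page and chat", which sounds like what GP is after.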
| ||||||||||||||||||||||||||
▲ | andsoitis 4 days ago | parent | prev [-]
> You should install it, because it's exactly what you just described.

Not OP, but it really isn't what they're looking for. Needing to install something and simply going to a web page are two very different things.