craftkiller | 4 days ago
It doesn't seem to be exactly what they are describing. The end-user interface matches, but it sounds like they want the actual LLM to run in the browser (perhaps via WebGPU compute shaders). Open WebUI relies on an external executor like ollama/llama.cpp, which can of course still be self-hosted, but the model is not executing INSIDE the browser.
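For what it's worth, in-browser inference via WebGPU does exist today, e.g. through the WebLLM project. A minimal sketch of what that looks like (assuming the `@mlc-ai/web-llm` package and one of its prebuilt model IDs; the exact model name here is illustrative):

```typescript
// Runs entirely in the browser: weights are downloaded and cached,
// and inference executes on the GPU via WebGPU compute shaders.
// Requires a WebGPU-capable browser; no external server like ollama.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // Model ID is an assumption -- pick any model from WebLLM's prebuilt list.
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC", {
    // Progress callback fires while the weights stream into the browser cache.
    initProgressCallback: (p) => console.log(p.text),
  });

  // OpenAI-style chat API, but everything happens client-side.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello from the browser!" }],
  });
  console.log(reply.choices[0].message.content);
}

main();
```

This is closer to what the parent seems to be asking for: after the first load, the model lives in browser storage and no external executor process is involved.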
Jemaclus | 4 days ago | parent
Does that even exist? It's basically what they described, just with some additional installation: once you install it, you select the LLM on disk and run it. That's what they asked for. Maybe I'm misunderstanding something.