mg 4 days ago
No, I'm looking for an HTML page with a button "Select LLM". After pressing that button and selecting a local LLM from disk, it would show an input field where you can type your question, and then it would use the given LLM to create the answer. I'm not sure what OpenWebUI is, but if it were what I mean, they would surely have the page live and not ask users to install Docker etc.
tmdetect 4 days ago
I think what you want is this: https://github.com/mlc-ai/web-llm
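For context, web-llm runs the model entirely in the browser via WebGPU, so no server or Docker install is needed. A minimal sketch of its usage (browser-only; requires the `@mlc-ai/web-llm` npm package and a WebGPU-capable browser; the model ID below is illustrative — check `webllm.prebuiltAppConfig` for the current list):

```typescript
import * as webllm from "@mlc-ai/web-llm";

async function ask(question: string): Promise<string> {
  // Downloads the model weights into browser cache and compiles
  // WebGPU kernels on first run; subsequent loads are much faster.
  const engine = await webllm.CreateMLCEngine(
    "Llama-3-8B-Instruct-q4f32_1-MLC", // illustrative model ID
    { initProgressCallback: (p) => console.log(p.text) },
  );

  // OpenAI-style chat completion API, executed locally in the browser.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: question }],
  });
  return reply.choices[0].message.content ?? "";
}

ask("What is WebGPU?").then((answer) => console.log(answer));
```

The trade-off versus a hosted page with a "Select LLM" button is that web-llm loads models from a curated list rather than arbitrary files on disk, though the weights are fetched once and cached locally.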
bravetraveler 4 days ago
It's both what you want and not: the chat/question interface is as you describe, but the lack of installation is not. The LLM work is offloaded to other software, not done in the browser. I would like to skip maintaining all this crap, though; I like your approach.
Jemaclus 4 days ago
You should install it, because it's exactly what you just described. Edit: From a UI perspective, it's exactly what you described. There's a dropdown where you select the LLM, and there's a ChatGPT-style chatbox. You just docker-up and go to town. Maybe I don't understand the rest of the request, but I can't imagine software where a webpage exists and just magically has LLMs available in the browser with no installation.