Jemaclus 4 days ago
Does that even exist? It's basically what they described, but with some additional installation? Once you install it, you can select the LLM on disk and run it? That's what they asked for. Maybe I'm misunderstanding something.
craftkiller 4 days ago | parent
Apparently it does, though I'm also learning about it for the first time in this thread. Personally, I just run llama.cpp locally in docker-compose with AnythingLLM for the UI, but I can see the appeal of having it all just run in the browser.
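For anyone curious, a setup like that can be sketched in a docker-compose file roughly as follows. This is a hypothetical sketch, not my exact config: the image tags, ports, model path, and environment variables are assumptions based on the two projects' published Docker images, so check their docs before copying it.

```yaml
# Sketch: llama.cpp serving a local GGUF model, AnythingLLM as the chat UI.
# Image names, ports, and env vars are assumptions -- verify against each
# project's documentation.
services:
  llama:
    image: ghcr.io/ggml-org/llama.cpp:server
    command: ["-m", "/models/model.gguf", "--host", "0.0.0.0", "--port", "8080"]
    volumes:
      - ./models:/models   # put your .gguf file here

  anythingllm:
    image: mintplexlabs/anythingllm
    ports:
      - "3001:3001"        # AnythingLLM web UI
    depends_on:
      - llama
```

Point AnythingLLM at the llama.cpp server's OpenAI-compatible endpoint (http://llama:8080/v1 from inside the compose network) in its LLM provider settings and you get the "pick a model on disk and chat" experience, just with more moving parts than a browser-only app.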