▲ craftkiller 4 days ago
Apparently it does, though I'm also learning about it for the first time in this thread. Personally, I just run llama.cpp locally via docker-compose with AnythingLLM as the UI, but I can see the appeal of having it all run in the browser.
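For anyone curious what that setup looks like, here's a minimal docker-compose sketch. The image tags, ports, and model path are my assumptions for illustration, not something from this thread; check each project's docs for the current images and flags.

```yaml
# Hypothetical sketch: llama.cpp server + AnythingLLM UI.
# Image names, ports, and the model filename are assumptions.
services:
  llama:
    image: ghcr.io/ggml-org/llama.cpp:server
    # Serve a local GGUF model over llama.cpp's HTTP server
    command: -m /models/model.gguf --host 0.0.0.0 --port 8080
    volumes:
      - ./models:/models
    ports:
      - "8080:8080"

  anythingllm:
    image: mintplexlabs/anythingllm
    ports:
      - "3001:3001"
    volumes:
      # Persist workspaces and settings across restarts
      - ./anythingllm:/app/server/storage
```

In AnythingLLM's settings you would then point the LLM provider at the llama.cpp server's endpoint (here, port 8080 on the host).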
▲ Jemaclus 4 days ago | parent
Oh, interesting. Well, TIL.