vntok 6 days ago
Open WebUI works perfectly fine with llama.cpp, though. They have very detailed quick-start docs on it: https://docs.openwebui.com/getting-started/quick-start/start...
wkat4242 6 days ago | parent
Oh thanks, I didn't know that :O I do also need an API server, though. The one built into Open WebUI is no good because it always reloads the model if you first use it from the web console and then make an API call with the same model (literally the same model from the workspace). Very weird, but I avoid it for that reason.
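For what it's worth, llama.cpp ships its own `llama-server`, which exposes an OpenAI-compatible API you can hit directly, sidestepping Open WebUI's built-in API entirely. A rough sketch (the model path and port are placeholders):

```shell
# Start llama.cpp's built-in HTTP server (model path is a placeholder)
llama-server -m ./models/your-model.gguf --port 8080

# In another shell, call the OpenAI-compatible chat endpoint directly
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```

Since the model stays loaded in `llama-server` itself, web-UI use and API calls hit the same resident instance.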