I am running Ollama as the backend and Open WebUI as the frontend. It handles downloading models and swapping between them.
What is the llama.cpp equivalent for that?