vunderba 5 hours ago
FWIW, Ollama already does most of this:

- Cross-platform
- Sets up a local API server

The tradeoff is a somewhat steeper learning curve, since you need to manually browse the model library and choose the model/quantization that best fit your workflow and hardware. OTOH, it's also open source, unlike LM Studio, which is proprietary.
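To illustrate what "sets up a local API server" means in practice: Ollama listens on port 11434 by default and exposes a plain HTTP/JSON endpoint at `/api/generate`. A minimal sketch using only the Python standard library (the model name is illustrative; you'd use whatever you've pulled locally):

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes the Ollama server is running)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of
    newline-delimited streaming chunks.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage would be something like `generate("llama3", "Why is the sky blue?")`, assuming that model has been pulled.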
randallsquared 4 hours ago | parent
I assumed from the name that it only ran llama-derived models, rather than whatever is available on Hugging Face. Is that not the case?