mrdependable · 5 hours ago
This is great, I've been trying to figure this stuff out recently. One thing I do wonder is what the options are for running your own model but using it from a different machine. I don't necessarily want to run the model on the machine I'm working from.
cortesoft · 4 hours ago
Ollama runs a web server that you use to interact with the models: https://docs.ollama.com/quickstart. You can also use the Kubernetes operator to run them on a cluster: https://ollama-operator.ayaka.io/pages/en/
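A minimal sketch of what that looks like from the client side, using only the standard library. Assumptions not in the comment above: the server was started with `OLLAMA_HOST=0.0.0.0` so it listens on all interfaces, it is reachable at the hypothetical address `192.168.1.50`, and a model named `llama3.2` has already been pulled there. Port 11434 is Ollama's default.

```python
import json
import urllib.request

def build_generate_request(host: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST to Ollama's /api/generate endpoint on a remote host."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        f"http://{host}:11434/api/generate",  # 11434 is Ollama's default port
        data=body,
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    # Hypothetical host/model — substitute your own.
    req = build_generate_request("192.168.1.50", "llama3.2", "Why is the sky blue?")
    with urllib.request.urlopen(req) as resp:
        # With "stream": False the reply is a single JSON object
        # whose "response" field holds the generated text.
        print(json.loads(resp.read())["response"])
```

The same endpoint works with `curl` or any OpenAI-compatible client pointed at the server's address.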
rebolek · 4 hours ago
ssh?
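One way to read this suggestion: keep Ollama bound to loopback on the remote box and use ssh local port forwarding, so nothing is exposed on the network. A small sketch that builds the forwarding command; `user@gpu-box` is a hypothetical remote host, and 11434 is Ollama's default port.

```python
import shlex
import subprocess  # only needed if you run the tunnel from Python

def ssh_tunnel_cmd(remote: str, port: int = 11434) -> list[str]:
    """ssh -N -L <port>:127.0.0.1:<port> <remote> — forwards
    localhost:<port> on this machine to the same port on the
    remote's loopback interface; -N means no remote command."""
    return ["ssh", "-N", "-L", f"{port}:127.0.0.1:{port}", remote]

if __name__ == "__main__":
    cmd = ssh_tunnel_cmd("user@gpu-box")
    print(shlex.join(cmd))
    # Run the printed command (or subprocess.Popen(cmd)); local clients
    # then talk to http://localhost:11434 as if the model ran here.
```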