thimabi 7 days ago
I’m not throwing in the towel on Ollama yet. They do need dollars to operate, but they still provide excellent software for running models locally without paying them a dime.
recursivegirth 7 days ago | parent
^ this. As a developer, Ollama has been my go-to for serving offline models. I then use Cloudflare Tunnels to make them available wherever I need them.
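The setup above can be sketched with standard commands; this assumes `ollama` and `cloudflared` are installed, and uses a cloudflared "quick tunnel" rather than whatever named-tunnel config the commenter actually runs (that detail isn't in the comment):

```shell
# Start the Ollama server locally (listens on 127.0.0.1:11434 by default).
ollama serve &

# Expose the local Ollama API through a Cloudflare quick tunnel.
# cloudflared prints a randomly generated *.trycloudflare.com URL
# that proxies requests back to localhost.
cloudflared tunnel --url http://localhost:11434
```

From another machine you can then point any Ollama client at the printed tunnel URL instead of `localhost:11434`. For anything beyond personal use, a named tunnel with access controls (e.g. Cloudflare Access) is the safer choice, since a bare tunnel exposes the API to anyone who has the URL.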