behnamoh 4 hours ago
> I mean, what's the point of using local models if you can't trust the app itself?

and you think ollama doesn't do telemetry/etc. just because it's open source?
thehamkercat 4 hours ago
That's why I suggested using llama.cpp in my other comment.