stingraycharles a day ago:
I don’t understand what you’re saying. What’s preventing you from using e.g. OpenRouter to run a query against Kimi-K2 from whatever provider?
hu3 a day ago:
And you'll get a faster model this way.
bgwalter a day ago:
Because you have Cloudflare (MITM 1), OpenRouter (MITM 2), and finally the "AI" provider, all of whom can read, store, analyze, and resell your queries.

EDIT: Thanks for downvoting what is literally one of the most important reasons for people to use local models. Denying and censoring reality does not prevent the bubble from bursting.
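For contrast with the routed setup described above, here is a minimal sketch of building a chat-completions request against a local OpenAI-compatible endpoint (servers like llama.cpp's or Ollama's expose one), so the prompt never transits Cloudflare, OpenRouter, or a hosted provider. The URL, port, and model name are assumptions for illustration:

```python
import json
import urllib.request

# Assumed local OpenAI-compatible endpoint; adjust host/port for your server.
LOCAL_ENDPOINT = "http://127.0.0.1:8080/v1/chat/completions"

def build_request(prompt, model="kimi-k2", endpoint=LOCAL_ENDPOINT):
    """Build a chat-completions POST request targeting a local server,
    keeping the query on-machine rather than routing it through
    third-party intermediaries."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("What is 2 + 2?")
print(req.full_url)  # request targets localhost only
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) requires a local model server actually running at that address.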