bgwalter a day ago
Because you have Cloudflare (MITM 1), OpenRouter (MITM 2), and finally the "AI" provider, all of whom can read, store, analyze, and resell your queries. EDIT: Thanks for downvoting what is literally one of the most important reasons for people to use local models. Denying and censoring reality does not prevent the bubble from bursting.
irthomasthomas a day ago | parent
You can use chutes.ai's TEE (Trusted Execution Environment), and Kimi K2 is running at about 100 t/s right now.