2ndorderthought 2 hours ago

Just run a local model, or run DeepSeek through another provider with a policy you like. The models are open weight and widely available, and still cheaper than ChatGPT or anything else through third parties.
yehosef an hour ago | parent

That's the pitch: it's open source, run it yourself. But >99% of people won't have the hardware needed to run these models at a quality close to SOTA. So they'll end up running the open-source models on CCP systems for a good price.