selcuka 6 hours ago
> Wikipedia is cheap compared to creating and training models.

DeepSeek said it spent $5.6M [1] on training V3, which doesn't sound like much for a near-SOTA model. An open-source entity could adopt a hybrid business model: for the first n months after a new model's release, charge a small fee to anyone who wants to host the model as a business, while keeping it fully free for individuals.