adgjlsfhk1 2 days ago
Even ignoring distillation, as long as hardware or ML keeps improving over time, training a new model from scratch is cheaper the later you do it.
aurareturn 5 hours ago | parent | next
If hardware gets better over time, it also gets better for OpenAI.
ef3dfd 2 days ago | parent | prev
Yep, the poster is assuming efficiencies won't come. They absolutely will. And this is a huge problem for OAI: since Google is pursuing vertical integration, it will acquire a cost advantage. As long as its model performance is good enough, it will push OAI and Anthropic out in the long run. The valuations of OAI and Anthropic are nonsense. A true valuation would incorporate failure risk, which is natural for fast-growing, money-losing startups. Anyone who takes those valuations seriously is incredibly delusional.