▲ biddit 4 hours ago
Strongly disagree with the thesis. Everything points to commoditization of models: open/distilled models lag the frontier by only 6-12 months. Regulatory capture is the only thing I'm scared of with regard to tooling options and cost.
▲ raincole an hour ago | parent | next [-]
The article is obviously bad (I quit reading after the second paragraph), but one side effect of AI training is the rising cost of hardware. We're getting commoditization of models... while reversing the commoditization of hardware.
▲ supern0va 3 hours ago | parent | prev [-]
> Everything points to commoditization of models. Open/distilled models lag behind frontier only by 6-12 months.

Yes, but every high-performing open-weights model coming out of China has (supposedly) been caught distilling frontier models. It seems like a lot of people are making assumptions about the state of the open-weights ecosystem based on information that may not be accurate. And if the big labs are able to reliably block distillation, we could see the two groups diverge in performance.
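For readers unfamiliar with what "distilling" means in this thread: it typically refers to training a smaller student model on the teacher's softened output distribution rather than on hard labels. A minimal sketch of that soft-label objective (the temperature value and function names here are illustrative, not any specific lab's recipe):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T softens the distribution,
    # exposing more of the teacher's "dark knowledge" about wrong classes.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) over temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)  # teacher's soft labels
    q = softmax(student_logits, temperature)  # student's predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (temperature ** 2) * kl.mean()
```

The "block distillation" scenario mentioned above amounts to denying outsiders access to those teacher logits (or to enough sampled outputs to approximate them), since the loss cannot be computed without them.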