Hendrikto 2 days ago:
Well, it is currently cheaper because it is massively subsidized. That will change when the subsidies stop. I don't think it is a good argument.
ihattendorf 2 days ago:
The claim was "It is cheaper", not "It will be cheaper". Until it actually _is_ cheaper, it doesn't make much sense to purchase $10k+ in hardware to run local models that are still worse than the frontier offerings.
hobofan 2 days ago:
No, it's not. AI *products* are quite often subsidized; AI *inference* almost certainly is not. A growing number of independent inference providers without VC backing serve open-weight models on a roughly cost-plus basis, which shows that subsidies are not a significant factor in inference pricing.
voxleone 2 days ago:
[dead]