HDThoreaun 7 hours ago

Only if training new models leads to better models. If the newly trained models are just a bit cheaper but not better, most users won't switch. Then the entrenched labs can stop training so much and focus on profitable inference.

kuschku 7 hours ago | parent | next [-]

If they really have 40-60% gross margins, then as training costs fall, newly trained models could offer the same product at half the price.
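The margin arithmetic can be sketched roughly. Assuming gross margin is defined as (price - cost) / price, and that a rival can match the incumbent's unit cost, the incumbent's cost floor determines how far prices can fall. The figures below are illustrative assumptions, not real pricing data:

```python
# Back-of-the-envelope sketch of the margin argument above.
# All numbers are hypothetical.

def competitor_price(incumbent_price: float, gross_margin: float,
                     competitor_margin: float = 0.0) -> float:
    """Price a cost-matching rival can charge for the same product.

    gross_margin = (price - cost) / price, so the incumbent's
    unit cost is price * (1 - gross_margin). The rival prices at
    cost / (1 - competitor_margin).
    """
    unit_cost = incumbent_price * (1.0 - gross_margin)
    return unit_cost / (1.0 - competitor_margin)

# At a 50% gross margin, matching cost lets a rival halve the price:
print(competitor_price(10.0, 0.50))        # 5.0
# Even keeping a 20% margin of its own, the rival still undercuts:
print(competitor_price(10.0, 0.50, 0.20))  # 6.25
```

So at the midpoint of the claimed 40-60% range, "same product at half the price" is exactly the zero-margin case.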
