cortesoft 3 hours ago

It would be really hard to properly account for the training, since that cost doesn't scale with how much generation happens.

The training is already done when you make a generative query. No matter how many consumers there are, the cost for training is fixed.

nospice 3 hours ago

My point is that it isn't fixed, not really. Usage begets more training, and this will likely continue for many years. So it's not a vanishing fixed cost, but pretty much just an ongoing expenditure associated with LLMs.

bob1029 3 hours ago

No one doing this for money intends to train models that will never be amortized. Some will fail and some are niche, but the big ones must eventually pay for themselves or none of this works.

The economy will destroy inefficient actors in due course. The environmental and economic incentives are not entirely misaligned here.

quietbritishjim 3 hours ago

> No one doing this for money intends to train models that will never be amortized.

Taken literally, this is just an agreement with the comment you're replying to.

Amortizing means that it is gradually written off over a period. That is completely consistent with averaging it over some amount of usage. For example, if a printing company buys a big new printing machine every 5 years (because that's how long they last before they wear out), they would amortize its cost over those 5 years (strictly it's depreciation rather than amortization, because it's a physical asset, but the idea is the same). It's still 100% possible to look at the number of documents they print over that period and calculate the cost of the machine per document. And that's perfectly consistent with the machine paying for itself.
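To make that concrete, here's a minimal sketch of the per-document calculation; the machine cost, lifetime, and print volume are all made-up figures, not anything from the thread:

    # Hypothetical figures for spreading a printing machine's cost
    # across every document it prints over its useful life.
    machine_cost = 500_000        # purchase price (assumed)
    lifetime_years = 5            # replaced every 5 years, per the example above
    docs_per_year = 2_000_000     # print volume (assumed)

    total_docs = lifetime_years * docs_per_year
    cost_per_doc = machine_cost / total_docs   # fixed cost spread over usage
    print(f"Amortized machine cost per document: ${cost_per_doc:.4f}")

The same arithmetic applies to a trained model: a fixed (or recurring) training bill divided by however many queries it serves before the next training run.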