trilogic | 5 days ago
I have to disagree. The biggest costs are still energy consumption, water, and maintenance. Not to mention the cost of keeping up with rivals at an incredibly high tempo (e.g., Meta recently offering billions). Then there is the cost of hardware, which tracks Nvidia's skyrocketing shares :) No one should dare to talk about profit yet. Now is the time to grab the market, invest a lot, and work hard, hoping for a future profit. The equation is still a work in progress.
jsnell | 5 days ago
The capital cost of the GPU is an order of magnitude larger than the energy consumption, and that holds whether the GPUs are used for training or inference. Back of the envelope: a $25k GPU amortized over 5 years is $5k/year. A 500W GPU running at full power uses about 4.4 MWh per year; at $0.15/kWh, that electricity costs roughly $650/year. The other operating costs you suggest have to be smaller still.
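
A quick sketch of that arithmetic in Python, using the same assumed figures ($25k GPU, 5-year amortization, 500 W draw, $0.15/kWh):

    # Back-of-the-envelope: GPU capital cost vs. electricity cost per year
    gpu_price_usd = 25_000
    amortization_years = 5
    power_kw = 0.5                  # 500 W at full load
    electricity_usd_per_kwh = 0.15
    hours_per_year = 24 * 365

    capital_per_year = gpu_price_usd / amortization_years                 # $5,000/year
    energy_kwh_per_year = power_kw * hours_per_year                       # ~4,380 kWh (~4.4 MWh)
    electricity_per_year = energy_kwh_per_year * electricity_usd_per_kwh  # ~$657/year

    print(f"capital: ${capital_per_year:,.0f}/yr, "
          f"electricity: ${electricity_per_year:,.0f}/yr, "
          f"ratio: {capital_per_year / electricity_per_year:.1f}x")

That prints a ratio of roughly 7.6x, i.e. capital cost dominates electricity by close to an order of magnitude.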
DoesntMatter22 | 5 days ago
Is that not baked into the H100 rental costs?
wtallis | 5 days ago
> The biggest cost is still energy consumption, water and maintenance.

Are you saying that the operating costs for inference exceed the costs of training?