simonw 8 hours ago

I think they mean that the DeepSeek API charges are less than it would cost for the electricity to run a local model. Local model enthusiasts often assume that running locally is more energy efficient than running in a data center, but fail to take the economies of scale into account.
croes 4 hours ago

Local enthusiasts don't have to fear account banning.
littlestymaar 6 hours ago

I guess it mostly comes from running the model at batch size 1 locally, versus a high batch size in a data center, since GPU power consumption doesn't grow much with batch size. Note that while a local chatbot user will mostly be at batch size 1, that stops being true if they're running an agentic framework, so the gap is going to narrow or even reverse.
jacquesm 6 hours ago

Some of those local model enthusiasts can actually afford solar panels.