phendrenad2 5 hours ago
I don't really understand why AI providers don't charge like the electric company, or AWS. Instead of increasing usage limits, just charge less for off-hours use.
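The electric-company model the comment describes could be sketched as a simple time-of-day tariff; the per-token rates and the peak window below are invented for illustration, not any provider's actual pricing:

```python
from datetime import datetime, timezone

# Hypothetical off-peak tariff: a lower per-token rate outside peak hours,
# analogous to time-of-use electricity pricing. All numbers are made up.
PEAK_RATE = 15.00     # $ per million tokens, during peak hours
OFF_PEAK_RATE = 7.50  # $ per million tokens, otherwise
PEAK_HOURS = range(8, 20)  # assumed peak window: 08:00-19:59 UTC

def price_per_million_tokens(ts: datetime) -> float:
    """Return the per-million-token rate in effect at a request timestamp."""
    hour = ts.astimezone(timezone.utc).hour
    return PEAK_RATE if hour in PEAK_HOURS else OFF_PEAK_RATE

def request_cost(tokens: int, ts: datetime) -> float:
    """Cost of a request of `tokens` tokens billed at the time-of-day rate."""
    return tokens / 1_000_000 * price_per_million_tokens(ts)
```

A request at noon UTC would bill at the peak rate and one at 03:00 UTC at half that, which is the incentive structure the comment is asking about.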
lxgr 5 hours ago | parent
LLM inference is much more geographically fungible than electricity, since requests can be routed to whichever region is currently off-peak, so maybe it's just not worth the complexity yet and there is enough (not highly latency-sensitive) load on average globally.