johncolanduoni 3 days ago
I guess the answer might be hyperscalers that want cheap electricity to run LLM inference. They're already throwing tens of billions at AI, so what's a few billion more for a chance at super cheap energy for their new data centers?