| ▲ | layer8 7 hours ago |
From the recent-ish Dwarkesh podcast, Anthropic seems wary of buying or building too much compute [0]. That probably means they have to minimize compute usage when there is a surge in demand. Following the argument in the podcast, throwing more money at them, as some in this thread are suggesting, won't solve the issue, at least not in the short term.

[0] https://www.dwarkesh.com/i/187852154/004620-if-agi-is-immine...
| ▲ | shdh 30 minutes ago | parent |
Likely accurate. This tends to happen during the pretraining phase of new models; it happened with the 3.x models too.