hn_throwaway_99 | 4 hours ago
Your comment is responding to an issue that is different from what GP said. GP was talking about Chinese open source specifically, i.e. their open source models, which AFAIK have consistently been keeping up with (albeit a few steps behind) the closed source OpenAI and Anthropic models. Hardware capacity is a separate issue entirely.
CharlieDigital | 4 hours ago | parent
I mean, this sentence is self-contradictory, no?

Hardware capability is at the very heart of both training and inference, which is why Nvidia and TSMC are posting record revenue and market capitalization. Divorcing hardware from the equation discounts a big part of winning this race.