adelks 6 hours ago

My current understanding is that regular consumer GPUs are very much capable of running large LLMs if they had enough VRAM, and buying two of them isn't out of this world.

There's no technical reason, nor financial (buyer-side) reason, that a 64GB consumer GPU doesn't exist, other than market segmentation. If a 24GB card costs 1000 bucks, people are ready to pay quadruple that for a 32GB one. And VRAM, not compute performance, is the main bottleneck for running larger LLMs. Some resort to offloading to system RAM, etc.
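To illustrate why VRAM is the bottleneck, here's a rough back-of-the-envelope sketch (my own numbers, just weights; KV cache and activations add more on top):

```python
# Rough VRAM needed just to hold a model's weights, assuming
# memory ~= parameter count * bytes per parameter.
# These are ballpark figures, not exact requirements.
def weight_vram_gib(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * 1e9 * bytes_per_param / 1024**3

for name, bpp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"70B model @ {name}: ~{weight_vram_gib(70, bpp):.0f} GiB")
# 70B @ fp16: ~130 GiB -- doesn't fit even in two 64GB cards
# 70B @ int4: ~33 GiB  -- won't fit in a 24GB consumer card,
#                         would fit comfortably in a 64GB one
```

So even aggressive quantization leaves the larger open models out of reach of a single 24GB card, while a hypothetical 64GB consumer card would run them fine.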

The segmentation between pro and consumer hardware is, I think, in large part artificial, for fatter margins.

Otherwise, yeah, hyperscalers have deep pockets, but they can only fill them by getting money back from their users/customers, and we're going to give them that by using their products and paying for them (directly or indirectly).

I don't think it's a good thing that only a select few companies gatekeep AI, and it feels to me like it's going that way with chip prices.