imiric 4 days ago
> I am not sure there is significant enough market for those.

How so? The prosumer local AI market is large and growing every day, and it is far more lucrative per capita than the gamer market.

Gamers are an afterthought for GPU manufacturers. NVIDIA has been neglecting the segment for years and is now focused on enterprise and AI workloads. Gamers get marginal performance bumps each generation, plus side-effect benefits from AI R&D (DLSS, etc.). The exorbitant prices and poor performance per dollar make this clear. It's plain extortion, and the worst part is that gamers have accepted that paying $1000+ for a GPU is perfectly reasonable.

> This segment really does not need even 32GB, let alone 64GB or more.

4K is becoming a standard resolution, and 16GB is not enough for it. 24GB should be the minimum, with 32GB for some headroom. And while 64GB is overkill for gaming, it would be nice if it were accessible at reasonable prices. After all, GPUs are not exclusively for gaming; we might want to run other workloads on them from time to time.

While I can imagine that VRAM manufacturing costs are much higher than DRAM costs, it's not unreasonable to conclude that NVIDIA, possibly in cahoots with AMD, has been artificially controlling prices. Hardware has always become cheaper and more powerful over time, yet for some reason GPUs buck that trend, and old GPUs somehow appreciate in value. Weird, huh. This can't be explained away as post-pandemic tax and chip shortages anymore.

Frankly, I would like some government body to investigate this industry, assuming it hasn't been bought out yet. Label me a conspiracy theorist if you wish, but there is precedent for this behavior in many industries.
Fnoord 3 days ago
I think the timeline is roughly: SGI in the '90s, then Nvidia gaming (alongside ATi and later AMD) eating that cake. Cryptocurrency took off at the end of the '00s / start of the '10s, though if we are honest, things like hashcat were already happening before that. After that, AI (LLMs) took off during the pandemic.

During the cryptocurrency hype, GPUs were already going for insane prices, and combined with low or surplus energy prices (which solar can cause, and nuclear should too) this let even governments make cheap money (and do hashcat cracking, too). If I were North Korea, I'd know my target. Turns out they did, but in a different way; that was around 2014. Add on top of this Stadia and GeForce Now as examples of renting GPUs for gaming (there are more, and Stadia flopped). I didn't mention LLMs since those are the most recent development.

All in all, it turns out GPUs are more valuable than what they were sold for if your goal isn't personal computer gaming. Hence the prices went up. Now, if you want to thoroughly investigate this market, you need to figure out what large foreign forces (governments, businesses, and criminal enterprises) use these GPUs for. The US government has long been aware of all of the above; hence the export restrictions on GPUs, which are meant to slow the opponent down so it cannot catch up. The opponent is the non-free world (China, North Korea, Russia, Iran, ...), though the current administration is acting insane.
rocqua 4 days ago
Why would Intel willingly join this cartel, then? Their GPU business is a slow upstart. If they have a play that could massively disrupt the competition, with only a small chance of epic failure, that should be very attractive to them.