icwtyjj 9 hours ago

A lot of the discussion around the RAM shortage seems to imply that it will recover, but AI companies slurping up RAM for training hasn't slowed down and probably never will. Are there any signs that the situation is improving, or is this just the new normal?
librasteve 9 hours ago
RAM pricing has always followed a boom/bust cycle: roughly a square wave whose period is about the time it takes to bring a new state-of-the-art fab online (around 3 years).
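As an illustration only, the boom/bust claim above can be sketched as a toy square-wave model; the 3-year period and the half-shortage/half-glut split are assumptions for the sketch, not industry data.

```python
def ram_cycle_phase(t_years: float, period: float = 3.0) -> str:
    """Toy square-wave model of the DRAM boom/bust cycle.

    `period` is the assumed time (in years) to bring a new
    state-of-the-art fab online; the cycle is modeled as half
    shortage, half glut, which is a simplification.
    """
    phase = (t_years % period) / period
    return "shortage (prices high)" if phase < 0.5 else "glut (prices low)"

# Sample the model every half year across two full periods.
timeline = [(t / 2, ram_cycle_phase(t / 2)) for t in range(13)]
```

In this sketch, new fab capacity coming online flips the market from shortage to glut until demand catches up again, which is the mechanism the comment is pointing at.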
thoughtpeddler 8 hours ago
From what I understand, the RAM shortage is more about AI inference than AI training. Yes, training created much of the early HBM crunch, because frontier-model training clusters need tons of HBM near the GPUs, but inference is what is keeping the pressure on now and into the future.
piskov 9 hours ago
Chinese fabs will alleviate some of the pressure.