bjornsing 4 days ago

> CUDA has been a huge moat

The CUDA moat is extremely exaggerated for deep learning, especially for inference. It’s simply not hard to do matrix multiplication and a few activation functions here and there.
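As a rough illustration (a toy sketch, nothing like a tuned kernel), the core of an inference forward pass really is just a matrix multiply followed by an elementwise activation:

    #include <vector>
    #include <algorithm>
    #include <cstddef>

    // Naive single-layer forward pass: y = relu(W * x).
    // Toy code for illustration only, not how a real inference kernel is written.
    std::vector<float> forward(const std::vector<float>& W,   // out_dim x in_dim, row-major
                               const std::vector<float>& x,   // in_dim
                               std::size_t out_dim,
                               std::size_t in_dim) {
        std::vector<float> y(out_dim, 0.0f);
        for (std::size_t i = 0; i < out_dim; ++i) {
            for (std::size_t j = 0; j < in_dim; ++j) {
                y[i] += W[i * in_dim + j] * x[j];   // matrix multiplication
            }
            y[i] = std::max(y[i], 0.0f);            // activation (ReLU)
        }
        return y;
    }

Real kernels obviously add tiling, fusion and memory tricks on top, but the API surface a DL framework actually needs is small.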

OkayPhysicist 4 days ago | parent | next [-]

It regularly shocks me that AMD doesn't release their cards with at least enough CUDA reimplementation to run DL models. As you point out, AI applications use a tiny subset of the overall API, the courts have ruled that APIs can't be protected by copyright, and CUDA is NVIDIA's largest advantage. It seems like an easy win, so I assume there's some good reason.

nerdsniper 4 days ago | parent | next [-]

A very cynical take: the AMD and Nvidia CEOs are cousins, and there's more money to be made with one dominant monopoly than with two competing companies. And that income could be an existential difference-maker for Taiwan.

kawaiikouhai 4 days ago | parent [-]

bro, both are American CEOs.

What is this racialized nonsense? Have you seen Jensen Huang speak Mandarin? His Mandarin is actually awful for someone who left Taiwan at 8.

tux1968 4 days ago | parent | prev | next [-]

AMD can't even figure out how to release decent Linux drivers in a timely fashion. It might not be the largest market, but it would at least have given them a competitive advantage in reaching some developers. Either their software team is deeply incompetent, or there are business reasons intentionally holding them back.

wmf 4 days ago | parent | prev [-]

They did; it's called HIP.
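For anyone who hasn't seen it: HIP is essentially the CUDA runtime API with the cuda prefix swapped for hip, and kernels are written the same way. A minimal sketch (assuming a ROCm/hipcc toolchain; the calls shown are the standard HIP runtime ones):

    #include <hip/hip_runtime.h>
    #include <cstdio>

    // Elementwise ReLU kernel -- written exactly like a CUDA __global__ kernel.
    __global__ void relu(float* data, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] = data[i] > 0.0f ? data[i] : 0.0f;
    }

    int main() {
        const int n = 1024;
        float host[n];
        for (int i = 0; i < n; ++i) host[i] = i - 512.0f;

        float* dev = nullptr;
        hipMalloc(reinterpret_cast<void**>(&dev), n * sizeof(float));   // cudaMalloc -> hipMalloc
        hipMemcpy(dev, host, n * sizeof(float), hipMemcpyHostToDevice); // cudaMemcpy -> hipMemcpy

        relu<<<n / 256, 256>>>(dev, n);   // same triple-chevron launch syntax under hipcc
        hipDeviceSynchronize();

        hipMemcpy(host, dev, n * sizeof(float), hipMemcpyDeviceToHost);
        hipFree(dev);
        printf("host[0]=%f host[1023]=%f\n", host[0], host[1023]);
        return 0;
    }

There's also a hipify tool that does the cuda-to-hip renaming mechanically on existing CUDA sources.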

axoltl 4 days ago | parent | prev | next [-]

From what I've been reading, inference workloads tend to ebb and flow throughout the day, with much lower load overnight than at, say, 10 AM PT / 1 PM ET. I understand companies fill that gap with training (because an idle GPU is the most expensive kind).

So for data centers, training is just as important as inference.

bjornsing 3 days ago | parent [-]

> So for data centers, training is just as important as inference.

Sure, and I’m not saying buying Nvidia is a bad bet. It’s the most flexible and mature hardware out there, and the huge installed base also means you know future innovations will align with this hardware. But it’s not primarily a CUDA thing or even a software thing. The Nvidia moat is much broader than just CUDA.

sciencesama 4 days ago | parent | prev [-]

The drivers are the most annoying issue! PyTorch likes CUDA so much that it just works; anything with ROCm just sucks!