martinald 2 days ago

Take this with a pinch of salt, but the most recent ROCm release installed out of the box on my WSL2 machine and worked the first time with llama.cpp. I even compiled llama.cpp from source with zero issues. That has never happened in my 5+ years of owning AMD GPUs. Every other time I've tried this it has either failed and required arcane workarounds, or just not worked at all (including on 'real' Linux).
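For anyone wanting to try the same thing, this is roughly what the build looked like. Treat it as a sketch: the CMake flag names have changed across llama.cpp versions (older trees used LLAMA_HIPBLAS, then GGML_HIPBLAS), and the GPU target is whatever matches your card.

    # assumes ROCm is already installed and hipcc is on PATH
    git clone https://github.com/ggml-org/llama.cpp
    cd llama.cpp
    # GGML_HIP enables the ROCm/HIP backend on recent checkouts;
    # set AMDGPU_TARGETS to your GPU's architecture (gfx1100 here is just an example)
    cmake -B build -DGGML_HIP=ON -DAMDGPU_TARGETS=gfx1100
    cmake --build build --config Release -j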

I feel like they're finally turning the corner on software and drivers.

jakogut a day ago

Llama.cpp also has a Vulkan backend that is portable and performant; you don't need to mess with ROCm at all.
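For reference, the Vulkan build is a one-flag change. This is a minimal sketch assuming the Vulkan headers and loader are installed (e.g. via the LunarG SDK or distro packages); recent llama.cpp versions use GGML_VULKAN, older ones used LLAMA_VULKAN.

    # enable the Vulkan backend instead of ROCm/HIP
    cmake -B build -DGGML_VULKAN=ON
    cmake --build build --config Release -j
    # -ngl offloads model layers to the GPU
    ./build/bin/llama-cli -m model.gguf -ngl 99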

martinald a day ago

Oh yes, I know, but "can I compile llama.cpp with ROCm" has been my yardstick for how good AMD's drivers are for some time.