mindcrime 5 days ago:
Hopefully, sooner rather than later, this will all mostly "just work" and won't be such an exercise in frustration for someone who hasn't been actively immersed in the AI world. There's definitely a lot of variation in experiences. In my case, on my box with an RX 7900 XTX, installing ROCm via apt did "just work": I can compile and run programs against the GPU, and things like Ollama get GPU acceleration with no weird fiddling or custom setup. And from what I hear, I'm definitely not the only person having this kind of experience.
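As an illustration of the "compile and run programs against the GPU" part, here's a minimal HIP device-query sketch; it's not from the comment above, and it assumes a working hipcc from the ROCm packages (build with something like `hipcc rocm_check.cpp -o rocm_check`). If ROCm is installed correctly, it prints the GPUs the runtime can see.

    // rocm_check.cpp: list the GPUs visible to the HIP/ROCm runtime.
    #include <hip/hip_runtime.h>
    #include <cstdio>

    int main() {
        int count = 0;
        hipError_t err = hipGetDeviceCount(&count);
        if (err != hipSuccess || count == 0) {
            std::printf("No HIP devices found (%s)\n", hipGetErrorString(err));
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            hipDeviceProp_t props;
            if (hipGetDeviceProperties(&props, i) == hipSuccess) {
                // props.gcnArchName is the GPU architecture string, e.g. a gfx* target.
                std::printf("Device %d: %s (%s)\n", i, props.name, props.gcnArchName);
            }
        }
        return 0;
    }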
tracker1 5 days ago | parent:
A lot of source repos on GitHub should seriously update their instructions, then.