mindcrime 4 hours ago

> Been running local LLMs on my 7900 XTX for months and the ROCm experience has been... rough.

Just out of curiosity... how so?

I only ask because I've been running local models (using Ollama) on my RX 7900 XTX for the last year and a half or so, and I can't think of a single problem I've had that was ROCm-specific. Actually, I've barely had any problems at all, other than the card being limited to 24GB of VRAM. :-(

I'm halfway tempted to splurge on a Radeon Pro board to get more VRAM, but ... haven't bitten the bullet yet.