vibe42 18 hours ago
Higher-end gaming laptops are still decently priced and work well for local AI inference. And Linux runs better than ever on them; I'm running Debian 13 with almost no driver issues. For $2k you can get 32 GB of DDR5 RAM and 16 GB of fast VRAM. Bump the RAM to 64 GB and you're still below $3k.
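A rough way to see what the 16 GB of VRAM buys you: weight memory is roughly parameter count times bytes per weight, plus some headroom for KV cache and activations. A minimal sketch, where the 4-bit factor (~0.55 bytes/weight including packing overhead) and the flat 2 GB overhead are illustrative assumptions, not measured numbers:

```python
# Rough VRAM estimate for running quantized LLMs locally.
# The bytes-per-weight factor and flat overhead are illustrative assumptions.

def vram_gb(params_b: float, bytes_per_weight: float, overhead_gb: float = 2.0) -> float:
    """Approximate VRAM needed: weights plus a flat allowance for KV cache/activations."""
    return params_b * bytes_per_weight + overhead_gb

# 4-bit quantization ~ 0.5 bytes/weight, plus ~10% packing overhead -> ~0.55
for params in (7, 13, 34, 70):
    need = vram_gb(params, 0.55)
    fits = "fits" if need <= 16 else "needs CPU/RAM offload"
    print(f"{params:>3}B @ 4-bit: ~{need:.1f} GB -> {fits} in 16 GB VRAM")
```

By this estimate, 7B and 13B models at 4-bit sit comfortably in 16 GB, while 34B and up would spill into system RAM (which is where the 64 GB upgrade helps).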
solstice 17 hours ago
What models or classes of models would I be able to run on that hardware? I've asked myself that question while looking at some of the models on this site: https://laptopparts4less.frl/index.php?route=common/home