cogman10 5 days ago

If you want to run models locally, this is where you'd start: https://ollama.com/

You can almost convert a model's parameter count to the GB of memory needed. For example, Deepseek-r1:7b needs about 7 GB of memory to run locally.
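A rough sketch of that rule of thumb: at 8-bit quantization, weights take about 1 byte per parameter, so billions of parameters maps roughly to GB of memory. The function below is illustrative, not exact; real usage depends on the quantization format and runtime overhead.

```python
def estimate_model_memory_gb(params_billions, bits_per_param=8, overhead_gb=0.5):
    """Very rough memory estimate for running a quantized LLM locally.

    params_billions: model size, e.g. 7 for a 7B model
    bits_per_param: quantization level (8-bit here; 4-bit halves the weights)
    overhead_gb: illustrative allowance for runtime/activations
    """
    weights_gb = params_billions * bits_per_param / 8
    return weights_gb + overhead_gb

# A 7B model at 8-bit: ~7 GB of weights, plus overhead
print(f"{estimate_model_memory_gb(7):.1f} GB")
# The same model at 4-bit needs roughly half the weight memory
print(f"{estimate_model_memory_gb(7, bits_per_param=4):.1f} GB")
```

This is why the parameters-to-GB shortcut works: it is just bytes-per-parameter arithmetic, and aggressive quantization (4-bit and below) is what lets larger models fit in less memory.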

Context window matters too: the more context you need, the more memory you'll need.
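Most of that context cost is the KV cache, which grows linearly with context length. A sketch of the arithmetic below; the layer/head dimensions are illustrative values roughly in the range of a 7B transformer, not the actual config of any particular model.

```python
def kv_cache_gb(context_len, n_layers=32, n_kv_heads=8, head_dim=128,
                bytes_per_val=2):
    """Rough KV-cache size for a transformer at a given context length.

    Stores one key and one value vector per layer, per KV head, per token.
    bytes_per_val=2 assumes 16-bit cache entries.
    """
    values = 2 * n_layers * n_kv_heads * head_dim * context_len  # 2 = K and V
    return values * bytes_per_val / 1e9

# Growing the context from 4k to 32k tokens costs ~8x the cache memory
print(f"{kv_cache_gb(4096):.2f} GB")
print(f"{kv_cache_gb(32768):.2f} GB")
```

With these example dimensions, a 4k context costs about half a GB while a 32k context costs several GB, which is why long-context use can dominate memory on top of the model weights themselves.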

If you are looking at AI devices around $2,500, you'll probably want something like this [1]. A unified memory architecture (which in practice means LPDDR5) gives you the most memory for the least money to play with AI models.

[1] https://frame.work/products/desktop-diy-amd-aimax300/configu...