colordrops a day ago

What's the best you can do hosting an LLM locally for under a given budget, say $5000? Is there a reference guide online for this? Is there a straight answer, or does it depend? I've looked at the Nvidia DGX Spark and high-end professional GPUs, but they all seem to have serious drawbacks.

teaearlgraycold a day ago | parent | next [-]

I’m cheating your budget a bit, but for $5600 you can get an M3 Ultra with 256GB of RAM.
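As a rough sanity check on what 256GB of unified memory buys, here is a back-of-envelope sketch of weight memory for a quantized model. It counts only the weights (parameter count × bits per weight); KV cache, activations, and runtime overhead add more on top, so treat the numbers as a lower bound, not a fit guarantee.

```python
def weight_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB: params * bits / 8.

    Ignores KV cache, activations, and runtime overhead, which
    add a meaningful margin on top of this figure.
    """
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# A 70B model at 4-bit quantization: ~35 GB of weights.
print(weight_gb(70, 4))    # 35.0
# A 405B model at 4-bit: ~202.5 GB -- weights alone fit in 256GB,
# though overhead and context length eat into the remaining headroom.
print(weight_gb(405, 4))   # 202.5
```

By this estimate, 256GB comfortably holds mid-size models at 8-bit and very large models only at aggressive quantization.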

cft a day ago | parent | prev [-]

https://www.reddit.com/r/LocalLLaMA/

colordrops a day ago | parent [-]

That's nice, thank you; I've joined and will follow. They don't seem to have a wiki or about page that synthesizes the current state of the art, though.