tristor · 7 hours ago
I am very excited by this, but my enthusiasm is dampened by the maximum memory being 128GB. I was really hoping for 256GB, which would let me run frontier models locally. With 128GB it's still feasible to use this with something like Qwen3-Coder-Next or MiniMax-M2.5, but models like Kimi-K2.5 will require heavy quantization to fit, and model quality will really suffer. I want to build proper local-first AI workflows at home, and I think Apple has an opportunity to make that possible in a way other companies aren't really focused on. But we need significantly larger memory capacities to do it, which I know is tough in the current memory market but should be available at a price.
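The back-of-envelope math behind this can be sketched as follows. This is a rough estimate only: the parameter counts and the overhead multiplier below are assumptions for illustration, not published specs for any of the models named above.

```python
def model_memory_gib(params_billion: float, bits_per_weight: float,
                     overhead: float = 1.2) -> float:
    """Rough unified-memory estimate for loading a model's weights.

    params_billion:  parameter count in billions (assumed, not a published spec)
    bits_per_weight: quantization width (16 = fp16, 8 = int8, 4 = int4)
    overhead:        multiplier for KV cache and runtime buffers;
                     1.2 is a guess and varies a lot with context length
    """
    bytes_per_weight = bits_per_weight / 8
    return params_billion * 1e9 * bytes_per_weight * overhead / 2**30

# A hypothetical ~1T-parameter model even at 4-bit quantization:
print(f"{model_memory_gib(1000, 4):.0f} GiB")   # hundreds of GiB, far past 128 GB

# A hypothetical ~235B-parameter model at 4-bit is borderline:
print(f"{model_memory_gib(235, 4):.0f} GiB")
```

The point being: at 4-bit, weights alone cost roughly half a gigabyte per billion parameters, so a trillion-parameter model is out of reach of 128GB no matter how you quantize it, while ~200B-class models sit right at the edge.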
vardump · 7 hours ago
Tell me about it. I checked the page wondering whether I should go for the 256 GB or the 512 GB RAM model. 128 GB maximum. Sigh.