simonw 6 hours ago:
Models of this size can usually be run using MLX on a pair of 512GB Mac Studio M3 Ultras, which cost about $10,000 each, so $20,000 for the pair.
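For anyone curious what that looks like in code, here's a minimal single-machine sketch with the mlx-lm Python package; the model repo name is a placeholder, and a model this large would actually need to be split across the two machines (MLX's distributed/pipeline tooling), which this snippet doesn't show:

    # Minimal sketch using mlx-lm (pip install mlx-lm).
    # The repo name below is a placeholder for an MLX-quantized
    # conversion of whatever model you actually want to run.
    from mlx_lm import load, generate

    model, tokenizer = load("mlx-community/SomeBigModel-4bit")  # hypothetical repo

    prompt = "Explain what unified memory buys you for LLM inference."
    text = generate(model, tokenizer, prompt=prompt, max_tokens=200, verbose=True)
    print(text)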
PlatoIsADisease an hour ago (in reply):
You might want to clarify that this is more of a "look, it technically works" than an "I actually use this." The difference between waiting 20 minutes for an answer to the prompt '1+1=' and actually using it for something useful is massive here.

I wonder where this idea of running AI on CPUs comes from. Was it Apple astroturfing? Apple fanboys? I don't see people wasting time on non-Apple CPUs. (Although I did do this once with a 7B model.)
| ||||||||