lowbloodsugar 2 days ago:
You run a 671B model at home?
|
segmondy 2 days ago:
Yes, and plenty of others do too. Quantized. Join us at r/localllama. My largest models:

318G  /llmzoo/models/Qwen3.5-397B
377G  DeepSeekv3.2-nolight
380G  /llmzoo/models/DeepSeek-V3.2-UD
400G  /llmzoo/models/Qwen3.5-397B-Q8
443G  DeepSeek-Math-v2
443G  DeepSeek-V3-0324-Q5
522G  /llmzoo/models/GLM5.1
545G  /llmzoo/models/kimi2.6
546G  /llmzoo/models/KimiK2.5
tclancy 2 days ago:
It's a big house.
|
UncleOxidant a day ago:
Maybe if there was a 1-bit quant.
barbacoa 19 hours ago:
Apple was briefly selling the Mac Studio with 512 GB of unified RAM, meaning all of it was available as VRAM.
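The arithmetic behind these two comments can be sketched roughly: weights-only memory for a 671B-parameter model at different quantization widths, ignoring KV cache, activations, and runtime overhead (so real requirements are somewhat higher). The helper name here is my own, for illustration:

```python
# Back-of-the-envelope weight memory for a quantized model.
# Ignores KV cache and runtime overhead; real usage is higher.

def weight_gib(params: float, bits_per_weight: float) -> float:
    """Weight memory in GiB for `params` parameters stored at
    `bits_per_weight` bits each."""
    return params * bits_per_weight / 8 / 2**30

for bits in (16, 8, 4, 1):
    print(f"671B at {bits}-bit: ~{weight_gib(671e9, bits):.0f} GiB")
```

At 4-bit the weights alone come in around 312 GiB, which is why a 512 GB unified-memory machine can hold such a model, while 16-bit weights would need well over a terabyte.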
2 days ago:
[deleted]