ddarolfi 4 hours ago
Qwen 4.6 36B? Do they mean Qwen3.6-35B-A3B?
trvz 3 hours ago
Yes. The author is really sloppy, if that wasn't clear from the article.
Johnny_Bonk 4 hours ago
So I have an RTX 3080 with 10 GB of VRAM, which I've been using with Qwen2.5 Coder and Gemma 4 E2B. I'm wondering what models you've tried, and with which quants.
mikeatlas 4 hours ago
yes