dsp_person 6 days ago
So is this aimed at small models only? Are there any advantages to these models compared to what I can run locally on a 16GB VRAM GPU? It would be nice to have something at the level of Claude 3.5.
Alex-Programs 5 days ago | parent
Yeah, proper V3/R1/K2/Qwen 235B are the point at which open LLMs become worth using.