andy_xor_andrew 3 days ago

> This is a 30B parameter MoE with 3B active parameters

Where are you finding that info? Not saying you're wrong; just saying that I didn't see that specified anywhere in the linked page, or on their HF.
gardnr 29 minutes ago

I was wrong. I confused this with their open model. Looking at it more closely, it is likely an omni version of Qwen3-235B-A22B. I wonder why they benchmarked it against Qwen2.5-Omni-7B instead of Qwen3-Omni-30B-A3B. I wish I could delete the comment.
plipt 3 days ago

The link[1] at the top of their article to HuggingFace goes to some models named Qwen3-Omni-30B-A3B that were last updated in September. None of them have "Flash" in the name. The benchmark table shows this Flash model beating their Qwen3-235B-A22B; I don't see how that is possible if it is a 30B-A3B model. I don't see a mention of a parameter count anywhere in the article. Do you? This may not be an open-weights model. This article feels a bit deceptive.