| ▲ | algorithm314 7 hours ago |
| Here is the pricing per M tokens. https://docs.z.ai/guides/overview/pricing Why is GLM 5 more expensive than GLM 4.7 even when using sparse attention? There is also a GLM 5-code model. |
|
| ▲ | logicprog 7 hours ago | parent | next [-] |
I think it's likely more expensive because GLM 5 has more activated parameters, which would outweigh the efficiency gains from DSA.
|
| ▲ | l5870uoo9y 7 hours ago | parent | prev [-] |
It's roughly a third of the price of GPT-5.2-codex, which in turn reflects the difference in energy costs between the US and China.

| ▲ | anthonypasq 7 hours ago | parent | next [-] |
1. Electricity is at most 25% of inference cost, so even if electricity is 3x cheaper in China, that works out to only about a 17% reduction in total cost (quick arithmetic sketch below).
2. Cost is only one input into pricing, and we have essentially no idea what the margins on inference actually are, so assuming current prices are tied to costs at all is suspect.
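As a rough sanity check of point 1, here is a small back-of-envelope calculation using the commenter's stated assumptions (25% electricity share of inference cost, 3x electricity price gap); the numbers and variable names are illustrative, not measured figures from any provider:

    # Back-of-envelope check of point 1 above. The 25% electricity share and
    # the 3x price ratio are the commenter's assumptions, not measured data.
    electricity_share = 0.25                    # assumed fraction of inference cost
    other_share = 1 - electricity_share         # hardware, staff, margin, etc.
    price_ratio = 3                             # assumed electricity price gap

    new_total = other_share + electricity_share / price_ratio
    reduction = 1 - new_total                   # 1 - (0.75 + 0.0833...) ≈ 0.167
    print(f"total cost reduction: {reduction:.1%}")  # -> total cost reduction: 16.7%

So under those assumptions, cheaper electricity alone accounts for at most a ~17% price difference, far short of a 3x gap.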
| ▲ | re-thc 7 hours ago | parent | prev [-] |
It reflects the Nvidia tax overhead too.

| ▲ | bigyabai 4 hours ago | parent [-] |
Not really, Western AI companies can set their margins wherever they want.
|
|