tasuki 4 hours ago

> The big kicker for GLM for me is I can use it in Pi, or whatever harness I like.

Yes, but... isn't the same true for Opus and all the other models too?

slopinthebag 4 hours ago | parent [-]

Opus is about seven times more expensive than GLM at API pricing. And since the Opus subscription plan only works in Claude Code (CC), you're essentially locked into API pricing for Pi and any other harness.

So you're either paying thousands of dollars for Opus in Pi, or $30/month for GLM in Pi. If the results are mostly equivalent, that's an easy choice for most of us.

tasuki 3 hours ago | parent [-]

Perhaps I'm being extremely daft: if the API is only seven times more expensive, why is it $1000 vs $30? Or is there a GLM subscription one can use with Pi? Certainly not available in my (arguably outdated) Pi.

RussianCow 3 hours ago | parent | next [-]

I'm not the OP, but it's the latter. I'm currently using the "Lite" GLM subscription with OpenCode, for example. I'm not using it very heavily, but I haven't come close to hitting the limits, whereas I burned through my weekly limits with Claude very regularly.

girvo 3 hours ago | parent | prev [-]

You can use GLM’s coding plan in Pi: just point it at the Anthropic-compatible API endpoint instead of the OpenAI-compatible one they give you.
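For reference, a minimal sketch of that setup, assuming the harness honors the common `ANTHROPIC_BASE_URL` / `ANTHROPIC_API_KEY` environment variables and that Z.ai's Anthropic-compatible endpoint is the right base URL for your plan (check your plan's docs; both names here are assumptions, not something Pi necessarily supports):

```shell
# Point an Anthropic-compatible harness at GLM's coding plan.
# The base URL is Z.ai's Anthropic-compatible endpoint; verify it
# against your plan's documentation before relying on it.
export ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic"
export ANTHROPIC_API_KEY="your-glm-coding-plan-key"  # placeholder key

# Then launch the harness as usual from the same shell, e.g.:
# pi
```

If the harness ignores these variables, the same two values (base URL and key) usually go in its provider config file instead.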

probst 3 hours ago | parent [-]

Or tell Pi to add support for the coding plan directly. That gave me GLM-5.1 support in no time, along with support for showing the remaining quota, etc.

It also compresses the context at around 100k tokens.

In case anyone is interested: https://github.com/sebastian/pi-extensions/tree/main/.pi/ext...