| ▲ | falloutx 7 hours ago |
| Opencode is much better anyway, and it doesn't change its workflow every couple of weeks. |
|
| ▲ | CyberShadow 4 hours ago | parent | next [-] |
| PSA - please ensure you are running OpenCode v1.1.10 or newer: https://news.ycombinator.com/item?id=46581095 |
|
| ▲ | rglullis 7 hours ago | parent | prev | next [-] |
| I signed up for Claude Pro when I figured out I could use it with opencode, so I could start things on Sonnet/Opus in plan mode and switch to cheaper models in build mode. Now that I can't do that, I will probably just cancel my subscription and do the dance between different hosted providers during the plan phase, then ask for a prompt to feed into opencode afterwards. |
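| A rough sketch of what that per-mode split can look like in opencode's config; the agent keys and model IDs below are assumptions based on opencode's opencode.json schema (built-in "plan" and "build" agents with a per-agent model override), so treat them as placeholders and check the current docs: |

    {
      "$schema": "https://opencode.ai/config.json",
      "model": "anthropic/claude-opus-4-5",
      "agent": {
        "plan": { "model": "anthropic/claude-opus-4-5" },
        "build": { "model": "openrouter/qwen/qwen3-coder" }
      }
    }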
| |
| ▲ | exitb 6 hours ago | parent [-] |
| As of yesterday, OpenAI seems to explicitly allow opencode on their subscription plans. |
| ▲ | rglullis 5 hours ago | parent | next [-] |
| Yeah, but that would mean me giving money to Sam Altman, and that ain't happening. |
| ▲ | akmarinov 6 hours ago | parent | prev | next [-] |
| Also, GPT 5.2 is better than slOpus. |
| ▲ | hdra 4 hours ago | parent | prev [-] |
| Can you point me to this claim? Also, last I checked, trying to connect to OpenAI seems to prompt for an API key. Does OpenAI's API key draw from the subscription quota? Just wanted to make sure before I sign up for an OpenAI sub. |
|
|
|
| ▲ | fgonzag 7 hours ago | parent | prev | next [-] |
| Yeah, honestly this is a bad move on Anthropic's part. I don't think their moat is as big as they think it is. They are competing against opencode + ACP + every other model out there, and there are quite a few good ones (even open-weight ones). Opus might currently be the best model out there, and CC might be the best tool among the commercial alternatives, but once someone switches to opencode + multiple model providers depending on the task, Anthropic is going to have difficulty winning them back, given its pricing and locked-down ecosystem. I went from Max 20x and ChatGPT Pro to Claude Pro and ChatGPT Plus + OpenRouter providers, and I have now cancelled Claude Pro and ChatGPT Plus, keeping only Gemini Pro (super cheap) and using OpenRouter models + a local AI workstation I built running MiniMax M2.1 and GLM 4.7. I use Gemini as the planner and my local models as the churners. Works great; the local models might not be as good as Opus 4.5 or Sonnet 4.7, but they are consistent, which is something I had been missing with all the commercial providers. |
| |
| ▲ | whimsicalism 6 hours ago | parent | next [-] |
| Disagree. It is much better for Anthropic to bundle than to become "just another model provider" to opencode/other routers. As a consumer, I absolutely prefer the latter model, but I don't think that's the position I would want to be in if I were Anthropic. |
| ▲ | behnamoh 5 hours ago | parent [-] |
| Nah, Anthropic thinks they have a moat; this is a classic Apple move, but they ain't Apple. |
| ▲ | whimsicalism 3 hours ago | parent [-] |
| They do have a moat. Opus is currently much better than every other model, except maybe GPT-5.2. |
|
| |
| ▲ | oblio 6 hours ago | parent | prev | next [-] |
| > I went from Max 20x and ChatGPT Pro to Claude Pro and ChatGPT Plus + OpenRouter providers, and I have now cancelled Claude Pro and ChatGPT Plus, keeping only Gemini Pro (super cheap) and using OpenRouter models + a local AI workstation I built running MiniMax M2.1 and GLM 4.7. I use Gemini as the planner and my local models as the churners. Works great; the local models might not be as good as Opus 4.5 or Sonnet 4.7, but they are consistent, which is something I had been missing with all the commercial providers. |
| You went from a 5-minute signup (and 20-200 bucks per month) to probably weeks of research (or prior experience setting up workstations) and probably days of setup. Also probably a few thousand bucks in hardware. I mean, that's great, but tech companies are a thing because convenience is a thing. |
| ▲ | fgonzag 3 hours ago | parent | next [-] |
| My first switch was to opencode + OpenRouter. I used it to try mixing models for different tasks and to try open-weight models before committing to the hardware. Even paying API pricing, it was significantly cheaper than the nearly $500 I was paying monthly (I was now spending about $100/month combined between Claude Pro, ChatGPT Plus, and OpenRouter credits). Only when I knew exactly the setup I wanted locally did I start looking at hardware. That part has been a PITA since I went with AMD for budget reasons, and it looks like I'll be writing my own inference engine soon, but I could have gone with Nvidia and had far fewer issues (for double the cost: dual Blackwells vs. quad Radeon W7900s for 192GB of VRAM). If you spend twice what I did and go Nvidia, you should have nearly no issues running any models. But using OpenRouter is super easy; there are always free models (Grok famously was free for a while), and there are very cheap and decent models. All of this doesn't matter if you aren't paying for your AI usage out of pocket. I was, so Anthropic's and OpenAI's value proposition vs. basically free Gemini + OpenRouter or local models just isn't there for me. |
| ▲ | falloutx 6 hours ago | parent | prev [-] |
| On opencode you can use models that are free for unlimited use, and you can pick models that only cost around $15 a month for unlimited use. |
| |
| ▲ | Scarbutt 6 hours ago | parent | prev [-] |
| > a local AI workstation |
| Peak HN comment. |
|
|
| ▲ | behnamoh 5 hours ago | parent | prev | next [-] |
| I like how I can cycle through agents in OpenCode using Tab. In CC, all my messages get interpreted by the "main" agent, so summoning a specific agent still wastes the main agent's tokens. In OpenCode, I can hit Tab and suddenly I'm talking to a different agent; no more "main agent" BS. |
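| For context, opencode's primary agents (the ones Tab cycles through) can be defined in config; below is a minimal sketch, assuming the "agent" block with "mode": "primary" and an inline "prompt" still works this way, with the agent name and model purely illustrative: |

    {
      "$schema": "https://opencode.ai/config.json",
      "agent": {
        "reviewer": {
          "mode": "primary",
          "model": "anthropic/claude-sonnet-4-5",
          "prompt": "Review diffs for correctness and style; do not edit files."
        }
      }
    }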
| |
| ▲ | fourthark 5 minutes ago | parent [-] |
| Maybe not quite as simple, but you can save and /resume lots of sessions in CC and switch between them quickly. |
|
|
| ▲ | whimsicalism 7 hours ago | parent | prev [-] |
| I find Cursor CLI significantly better than opencode right now, unfortunately. Edit: for those downvoting, I would earnestly like to hear your thoughts. I want opencode and similar solutions to win. |