the_arun 4 hours ago

I understand this helps if we have our own LLM runtime. What about external services like ChatGPT / Gemini (LLM providers)? Shouldn't they provide this feature to all their clients out of the box?
jmuncor 2 hours ago | parent

This works with Claude Code and Codex, so you can use it with either of those — you don't need a local LLM running. :)