eli 5 hours ago

There are already various proxies to translate between OpenAI-style models (local or otherwise) and an Anthropic endpoint that Claude Code can talk to. Is the advantage here just one less piece of infrastructure to worry about?

g4cg54g54 4 hours ago | parent [-]

Sidetracking here - but has anyone got one that _actually_ works?

In particular, I'd like to call Claude models - hosted in OpenAI schema by a reseller - through some proxy that presents the Anthropic format to my `claude` CLI. But nothing seems to fully line things up (double-translated tool names, for example).

The reseller is abacus.ai. I've tried BerriAI/litellm, musistudio/claude-code-router, ziozzang/claude2openai-proxy, 1rgs/claude-code-proxy, and fuergaosi233/claude-code-proxy.
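For context on where the "double-translated tool names" problem comes from: both tool-call formats are public, and a single hop of the translation these proxies attempt looks roughly like this (a minimal sketch; the message contents are illustrative). The mangling tends to happen when a request passes through two such translations back to back.

```python
import json

def openai_tool_calls_to_anthropic(message):
    """Convert an OpenAI-style assistant message carrying tool_calls
    into an Anthropic-style message with tool_use content blocks."""
    blocks = []
    if message.get("content"):
        blocks.append({"type": "text", "text": message["content"]})
    for call in message.get("tool_calls", []):
        blocks.append({
            "type": "tool_use",
            "id": call["id"],
            "name": call["function"]["name"],
            # OpenAI sends arguments as a JSON string; Anthropic wants an object
            "input": json.loads(call["function"]["arguments"]),
        })
    return {"role": "assistant", "content": blocks}

msg = {
    "role": "assistant",
    "content": None,
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {"name": "get_weather", "arguments": '{"city": "Berlin"}'},
    }],
}
print(openai_tool_calls_to_anthropic(msg))
```

The reverse direction also has to restore the original `name` and `id` exactly, which is where chained proxies tend to diverge.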

kristopolous 3 hours ago | parent [-]

What probably needs to exist is something like `llsed`.

The invocation would look like this:

    llsed --host 0.0.0.0 --port 8080 --map_file claude_to_openai.json --server https://openrouter.ai/api
where the JSON map file has entries like

    { "tag": ..., "from": ..., "to": ..., "params": ..., "pre": ..., "post": ... }
So if one call maps to two, you can invoke multiple rules in the pre or post hooks, or rearrange things accordingly.
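Since `llsed` is only being written now, the rule semantics are a guess, but one plausible reading is that `from`/`to` are dotted key paths into the request body. A minimal sketch of applying one such rule (the `stop_sequences` → `stop` rename is a real Anthropic-to-OpenAI correspondence; the rule-application logic is hypothetical):

```python
import copy

def get_path(obj, path):
    # Walk a dotted path like "stop_sequences" or "metadata.user_id"
    for key in path.split("."):
        obj = obj[key]
    return obj

def set_path(obj, path, value):
    keys = path.split(".")
    for key in keys[:-1]:
        obj = obj.setdefault(key, {})
    obj[keys[-1]] = value

def apply_rule(payload, rule):
    """Apply one {from, to} mapping rule: move a field to a new path."""
    out = copy.deepcopy(payload)
    value = get_path(out, rule["from"])
    # delete the field at its old path...
    parts = rule["from"].split(".")
    parent = out
    for key in parts[:-1]:
        parent = parent[key]
    del parent[parts[-1]]
    # ...and write it at the new one
    set_path(out, rule["to"], value)
    return out

# Anthropic calls it stop_sequences; OpenAI calls it stop
req = {"model": "m", "stop_sequences": ["END"]}
print(apply_rule(req, {"from": "stop_sequences", "to": "stop"}))
```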

This sounds like the proper separation of concerns here... probably

The pre/post hooks should probably be JSON-RPC handlers that get lazy-loaded.

Writing that now. Let's do this: https://github.com/day50-dev/llsed