htrp a year ago

Control your own inference endpoints.

its_down_again a year ago | parent [-]

Could you explain more on how to do this? E.g., if I am using the Claude API in my service, how would you suggest I go about setting up and controlling my own inference endpoint?

handfuloflight a year ago | parent | next [-]

You can't with the Claude API. He means using open-source models.

datavirtue a year ago | parent | prev [-]

Run a local LLM tuned for coding on LM Studio. It has a server and provides endpoints.
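To make this concrete: LM Studio's local server exposes an OpenAI-compatible chat API, by default at http://localhost:1234/v1. A minimal sketch of calling it from Python, using only the standard library — the base URL and the model name `local-model` are assumptions here (the model name is a placeholder for whatever model you have loaded in LM Studio):

```python
import json
import urllib.request

# Assumed default address of the LM Studio local server (configurable in the app).
BASE_URL = "http://localhost:1234/v1"


def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,  # placeholder; LM Studio serves whichever model is loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }


def ask(prompt: str) -> str:
    """POST the payload to the local endpoint and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask("Write a Python function that reverses a string."))
```

Because the endpoint speaks the OpenAI wire format, you can also point an existing OpenAI client library at `BASE_URL` instead of hand-rolling requests, which makes swapping between a hosted API and your own endpoint a one-line config change.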