htrp 3 days ago

Control your own inference endpoints.

its_down_again 3 days ago | parent [-]

Could you explain more on how to do this? e.g if I am using the Claude API in my service, how would you suggest I go about setting up and controlling my own inference endpoint?

handfuloflight 3 days ago | parent | next [-]

You can't with the Claude API itself. He means running open-source models yourself.

datavirtue 3 days ago | parent | prev [-]

Run a local LLM tuned for coding in LM Studio. It has a built-in server that exposes OpenAI-compatible endpoints.
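
For example, a minimal sketch assuming LM Studio's server is running on its default port 1234 and you're using the official openai Python client (the model name is just illustrative, use whatever you have loaded):

    # Point the OpenAI-compatible client at a local LM Studio server
    # instead of a hosted API. Assumes the server is on localhost:1234.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:1234/v1",  # LM Studio's local server
        api_key="lm-studio",  # any non-empty string; not checked locally
    )

    resp = client.chat.completions.create(
        model="qwen2.5-coder-7b-instruct",  # whichever model you loaded
        messages=[{"role": "user", "content": "Write a Python quicksort."}],
    )
    print(resp.choices[0].message.content)

Since the endpoint shape matches the hosted APIs, swapping providers is mostly a base_url change in your service.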