thatrandybrown 2 days ago
I like the idea of this and the use case, but I don't love the tight coupling to OpenAI. I'd love to see a framework that allows BYOM (bring your own model).
Onawa 2 days ago
It's been 2.5 years since ChatGPT came out, and so many projects still don't allow easy switching of OPENAI_BASE_URL or the related parameters. So many inference libraries serve an OpenAI-compatible API that any new project locked in to OpenAI only is a large red flag for me.
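For context, a minimal sketch of what that switching looks like with the OpenAI Python SDK: the same client code can point at OpenAI or at any OpenAI-compatible server (vLLM, llama.cpp, Ollama, etc.) just by changing the base URL. The localhost URL and model name below are hypothetical examples, not anything from traceroot.

```python
import os
from openai import OpenAI

# Swap providers purely via environment variables; no code change needed.
# Hypothetical defaults: a local vLLM server exposing the OpenAI-compatible /v1 API.
client = OpenAI(
    base_url=os.environ.get("OPENAI_BASE_URL", "http://localhost:8000/v1"),
    api_key=os.environ.get("OPENAI_API_KEY", "not-needed-for-local"),
)

response = client.chat.completions.create(
    model=os.environ.get("MODEL_NAME", "meta-llama/Llama-3.1-8B-Instruct"),
    messages=[{"role": "user", "content": "Summarize this trace for me."}],
)
print(response.choices[0].message.content)
```

Projects that hardcode the client instead of reading these variables are the ones that end up OpenAI-only by accident.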
zecheng 2 days ago
Yes, there is a roadmap to support more models. For now there is an in-progress PR to support Anthropic models: https://github.com/traceroot-ai/traceroot/pull/21 (contributed by some active open source contributors). Feel free to let us know which (open source) model or framework (vLLM etc.) you want to use :)
ethan_smith 2 days ago
Adding model provider abstraction would significantly improve adoption, especially for organizations with specific LLM preferences or air-gapped environments that can't use OpenAI.
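A rough sketch of what such an abstraction could look like (not traceroot's actual design; class and method names here are hypothetical): a small interface that every provider implements, with one adapter for OpenAI-compatible endpoints and one for Anthropic's SDK.

```python
from typing import Protocol


class ChatProvider(Protocol):
    """Anything that can answer a single chat prompt."""
    def complete(self, prompt: str) -> str: ...


class OpenAICompatibleProvider:
    """Covers OpenAI itself and any OpenAI-compatible server (vLLM, etc.)."""
    def __init__(self, base_url: str, api_key: str, model: str):
        from openai import OpenAI
        self._client = OpenAI(base_url=base_url, api_key=api_key)
        self._model = model

    def complete(self, prompt: str) -> str:
        resp = self._client.chat.completions.create(
            model=self._model,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content or ""


class AnthropicProvider:
    """Anthropic's SDK has a different surface; the interface hides that."""
    def __init__(self, api_key: str, model: str):
        import anthropic
        self._client = anthropic.Anthropic(api_key=api_key)
        self._model = model

    def complete(self, prompt: str) -> str:
        resp = self._client.messages.create(
            model=self._model,
            max_tokens=1024,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.content[0].text
```

The rest of the application only ever sees ChatProvider, so adding a new backend (or a local model for an air-gapped deployment) means writing one adapter rather than touching call sites.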