NeutralCrane 2 hours ago

We are just now looking into LLM gateways, and LiteLLM was one I was considering. I'm curious to hear more about what makes the code quality garbage.

smcleod 3 minutes ago

I've deployed the LiteLLM proxy in a number of locations and we're looking to swap it out (probably for Bifrost). We've seen many bugs that never should have made it into a release, most stemming from poor code quality or what I'd classify as poor development practices. It's also slow, doesn't scale well, and adds a lot of latency.

Bugs include, but are not limited to: budget limits not being enforced in several ways, parameter-handling issues, configuration/state mismatches, etc...
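
For context, a minimal sketch of the kind of budget enforcement I mean, using the proxy's virtual-key workflow. The proxy URL, master key, and model name are placeholders, and I'm going from memory of the /key/generate endpoint and its max_budget / budget_duration fields, so treat the exact field names as an assumption:

    import requests

    # Placeholder proxy URL and master key for illustration.
    PROXY_URL = "http://localhost:4000"
    MASTER_KEY = "sk-master-example"

    # Create a virtual key with a spend cap via the proxy's /key/generate endpoint.
    # The expectation is that requests made with this key stop being served once
    # tracked spend exceeds max_budget (in USD); this is one of the limits we saw
    # not being enforced.
    resp = requests.post(
        f"{PROXY_URL}/key/generate",
        headers={"Authorization": f"Bearer {MASTER_KEY}"},
        json={
            "max_budget": 10.0,        # spend cap in USD (assumed field name)
            "budget_duration": "30d",  # reset window (assumed field name)
            "models": ["gpt-4o"],      # models this key may call
        },
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json()["key"])  # the generated virtual key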

everlier an hour ago

How about a bug where tool calls don't work, but only for the Ollama provider and only when streaming is enabled? That's one of the real issues I had to debug with LiteLLM.
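
For what it's worth, a minimal sketch of that combination (tool definitions plus streaming against the Ollama provider) via the litellm Python SDK; the model name and tool schema are just placeholders:

    import litellm

    # Placeholder tool definition in OpenAI function-calling format.
    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }]

    # Tool calling with streaming against the Ollama provider. The issue was that
    # tool calls came through fine without stream=True but broke once it was set.
    response = litellm.completion(
        model="ollama/llama3.1",  # assumes a local Ollama server with this model pulled
        messages=[{"role": "user", "content": "What's the weather in Paris?"}],
        tools=tools,
        stream=True,
    )

    for chunk in response:
        delta = chunk.choices[0].delta
        # With the bug, tool_calls never showed up (or arrived malformed) in the deltas.
        print(delta.tool_calls or delta.content)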

SOLAR_FIELDS 2 hours ago

I personally had no issues using the client libs; my only complaint is that they only offer official Python ones. I'd love to see them publish a TypeScript one.