mkw5053 | 5 days ago
I kept finding myself writing mini backends for LLM features in apps, if for no other reason than to keep API keys out of client code. Even with Vercel's AI SDK, you still need a (potentially serverless) backend to handle the API calls securely. So I've been working on an open source LLM proxy that handles the boring stuff: a small SDK lets you call OpenAI or Anthropic from your frontend, and the proxy manages secrets, auth, rate limits, and logs. As far as I know, this is the first way to add LLM features without writing any backend code at all, like what Stripe does for payments, Auth0 for auth, or Firebase for databases. It's TypeScript/Node.js, with JWT auth using short-lived tokens (the SDK auto-handles refresh) and rate limiting. Features are very limited right now, but we're actively adding more. Currently adding bring-your-own-auth (Auth0, Clerk, Firebase, Supabase) to lock down the API even further.
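The comment doesn't show the SDK internals, but the "short-lived tokens, SDK auto-handles refresh" part can be sketched. This is a hypothetical illustration, not the project's actual API: `fetchToken` stands in for whatever call the SDK makes to the proxy's token endpoint, and the class just caches the JWT and re-fetches it shortly before expiry.

```typescript
// Hypothetical sketch of client-side short-lived-JWT handling.
// `fetchToken` is an assumed stand-in for the proxy's token-issuing call.

type Token = { value: string; expiresAt: number }; // expiresAt in epoch ms

class TokenManager {
  private token: Token | null = null;

  constructor(
    private fetchToken: () => Promise<Token>,
    private skewMs = 5_000, // refresh a little before actual expiry
  ) {}

  // Returns a valid JWT, fetching a fresh one only when the cached
  // token is missing or about to expire.
  async get(): Promise<string> {
    if (!this.token || Date.now() >= this.token.expiresAt - this.skewMs) {
      this.token = await this.fetchToken();
    }
    return this.token.value;
  }
}

// Usage with a fake issuer (no network), just to show the caching flow:
async function demo() {
  let issued = 0;
  const mgr = new TokenManager(async () => ({
    value: `jwt-${++issued}`,
    expiresAt: Date.now() + 60_000, // 60s lifetime
  }));
  const a = await mgr.get(); // fetches jwt-1
  const b = await mgr.get(); // still fresh, served from cache
  console.log(a, b, issued); // jwt-1 jwt-1 1
}
demo();
```

The nice property for frontend code is that every LLM request just awaits `get()` and never touches a long-lived API key; the real secret stays on the proxy.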
bravesoul2 | 3 days ago | parent
A way to single-click install stuff like this (a more modern cPanel) would be excellent for letting non-backend people deploy apps like this. I guess a bunch of YAML files for each of the main PaaS services would get you nearly there.