ssalka | 2 hours ago

Personally, I would not run LM Studio anywhere outside of my local network, as it still doesn't support adding an SSL cert. You can layer a proxy server on top of it, but for a product meant to be easy to set up, built-in TLS seems like a quick win, and I don't see any reason not to build support for it. https://github.com/lmstudio-ai/lmstudio-bug-tracker/issues/1...
dmd | an hour ago

Adding Caddy as a proxy server is literally one line in a Caddyfile, and I trust Caddy to get it right once more than I trust every random project to bolt on SSL themselves.
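A minimal sketch of that setup, assuming LM Studio's server is listening on its default port 1234 and `lmstudio.example.com` is a hypothetical domain pointing at the host:

```
# Caddyfile — Caddy obtains and renews a Let's Encrypt cert automatically
lmstudio.example.com {
    reverse_proxy localhost:1234
}
```

With a public DNS record in place, `caddy run` is enough; no separate certbot step is needed.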
jermaustin1 | an hour ago

Because adding Caddy/nginx/Apache + Let's Encrypt is a couple of bash commands between install and setup, and those HTTP servers' TLS termination is going to be 100x better than anything LMS adds itself, since that isn't their core competency.
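For the nginx variant, the "couple of bash commands" look roughly like this; a sketch assuming Debian/Ubuntu and a hypothetical DNS name `lms.example.com` already pointing at the machine:

```
# Install nginx plus certbot with its nginx plugin
sudo apt install nginx certbot python3-certbot-nginx

# Obtain a Let's Encrypt cert and let certbot patch the nginx
# site config for TLS; you'd then add a proxy_pass to the
# local LM Studio port (1234 by default) in that server block
sudo certbot --nginx -d lms.example.com
```

Certbot also installs a systemd timer for renewal, so the cert keeps itself up to date.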
makeramen | an hour ago

Tailscale serve
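Tailscale can terminate TLS on the tailnet for you; a sketch, assuming LM Studio on its default port 1234 (the flag syntax has shifted across Tailscale versions, so check `tailscale serve --help`):

```
# Serve https://<machine>.<tailnet>.ts.net, proxied to the
# local LM Studio port, in the background
tailscale serve --bg localhost:1234
```

Tailscale provisions the cert for the machine's `ts.net` name, so nothing is exposed to the public internet at all.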
Nijikokun | an hour ago

That's why I use Caddy or ngrok.ai.