Show HN: Run LLMs in Docker for any language without prebuilding containers (github.com)
12 points by mheap 4 days ago | 5 comments
I've been looking for a way to run LLMs safely without needing to approve every command. There are plenty of projects out there that run the agent in Docker, but they don't always contain the dependencies that I need. Then it struck me: I already define project dependencies with mise. What if we could build a container on the fly for any project by reading the mise config?

I've been using agent-en-place for a couple of weeks now, and it's working great! I'd love to hear what y'all think.
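For a rough idea of the approach, here is a minimal sketch in Go of turning a project's mise [tools] table into a throwaway Dockerfile. The base image, the mise install step, and the generateDockerfile helper are illustrative assumptions, not what agent-en-place actually emits:

    package main

    import (
        "fmt"
        "os"
        "strings"
    )

    // generateDockerfile sketches a throwaway Dockerfile from a parsed mise
    // [tools] table (tool name -> version). Base image, install commands,
    // and PATH layout are assumptions for illustration only.
    func generateDockerfile(tools map[string]string) string {
        var b strings.Builder
        b.WriteString("FROM debian:bookworm-slim\n")
        b.WriteString("RUN apt-get update && apt-get install -y curl git ca-certificates\n")
        // Install mise inside the image, then let it install the project's toolchain.
        b.WriteString("RUN curl -fsSL https://mise.run | sh\n")
        b.WriteString("ENV PATH=\"/root/.local/bin:/root/.local/share/mise/shims:$PATH\"\n")
        for tool, version := range tools {
            b.WriteString(fmt.Sprintf("RUN mise use --global %s@%s\n", tool, version))
        }
        b.WriteString("WORKDIR /workspace\n")
        return b.String()
    }

    func main() {
        // Example [tools] table as it might appear in a project's mise.toml.
        tools := map[string]string{"go": "1.22", "node": "20"}
        os.Stdout.WriteString(generateDockerfile(tools))
    }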
sshine 2 hours ago
KolmogorovComp an hour ago
Thanks for sharing, works well
verdverm an hour ago
I've been working on something similar, but geared towards integration with VS Code. Builds on CUE + Dagger via https://github.com/hofstadter-io/hof/tree/_next/examples/env

This allows you to not only run current commands in a containerized environment, but also run them at any point in history

> .go-version

I would not call this idiomatic Go, you can get the version my project requires from my go.mod file. Imo, a single file with the inputs your tool needs would be preferable to a bunch of files with a single line, but ideally it can infer that on its own by looking at the language files that already exist
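To illustrate that last point, a stdlib-only sketch of inferring the toolchain version from go.mod instead of a separate .go-version file. The function name and error handling are my own, not hof's or agent-en-place's API:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "strings"
    )

    // goVersionFromGoMod scans a go.mod file for its "go" directive, the
    // same information a separate .go-version file would duplicate.
    func goVersionFromGoMod(path string) (string, error) {
        f, err := os.Open(path)
        if err != nil {
            return "", err
        }
        defer f.Close()

        scanner := bufio.NewScanner(f)
        for scanner.Scan() {
            line := strings.TrimSpace(scanner.Text())
            if v, ok := strings.CutPrefix(line, "go "); ok {
                return strings.TrimSpace(v), nil
            }
        }
        return "", fmt.Errorf("no go directive found in %s", path)
    }

    func main() {
        v, err := goVersionFromGoMod("go.mod")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        fmt.Println(v) // e.g. "1.22"
    }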