physicles 5 hours ago

We’re in the middle of this right now. Go makes this easier: the go CLI can list a package’s dependencies, which you can cross-reference with recent git changes (duplicating the dependency graph in another build tool is a non-starter for me). But there are corner cases we’re still working through.
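For concreteness, a minimal sketch of that check using go list -deps and git diff, assuming a single-module repo built from the repo root; ./cmd/myservice and origin/main are placeholders:

    # Import paths the service transitively depends on, limited to this module.
    go list -deps ./cmd/myservice | grep "^$(go list -m)/" > deps.txt

    # Directories touched since the base ref, rewritten as import paths.
    git diff --name-only origin/main...HEAD \
      | xargs -n1 dirname | sort -u \
      | sed "s|^|$(go list -m)/|" > changed.txt

    # Rebuild/redeploy only if a changed directory is in the dependency set.
    if grep -qxFf changed.txt deps.txt; then
      echo "./cmd/myservice is affected"
    fi

Non-Go inputs (embedded files, configs) and files at the repo root are exactly the sort of corner case this misses.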

On top of that, if you want build + deploy that’s faster than running it manually from your dev machine, you pay $$$ for either something like Depot or a beefy VM to host CI.

A bit more work on those dependency corner cases, along with an auto-sleeping VM, should let us achieve nirvana. But it’s not like we have a lot of spare time on our small team.

GeneralMayhem 4 hours ago | parent

Go with Bazel gives you a couple options:

* You can use gazelle to auto-generate Bazel rules across many modules - I think the most up-to-date usage guide is https://github.com/bazel-contrib/rules_go/blob/master/docs/g.... (a rough sketch of the setup is below, after this list).

* In addition, you can make your life a lot easier by just making the whole repo a single Go module. Having gone down the alternate path - trying to keep go.mod and Bazel build files in sync - I would definitely recommend only one module per repo unless you have a very high pain tolerance or actually need to be able to import pieces of the repo with standard Go tooling.
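On the gazelle point, a minimal sketch of the root BUILD.bazel, assuming a roughly current rules_go/gazelle setup (the load label and the placeholder module path will vary; the linked guide is authoritative):

    # BUILD.bazel at the repo root.
    load("@bazel_gazelle//:def.bzl", "gazelle")

    # Placeholder module path below; it should match the one in go.mod.
    # gazelle:prefix github.com/yourorg/yourrepo
    gazelle(name = "gazelle")

Running bazel run //:gazelle then walks the tree and writes/updates go_library, go_binary, and go_test targets from your Go imports, so the dependency graph is never hand-maintained in two places.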

> a beefy VM to host CI

Unless you really need to self-host, GitHub Actions or GCP Cloud Build can be set up to reference a shared Bazel cache server, which keeps builds quite snappy since Bazel never rebuilds a target whose inputs haven't changed.
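Wiring that up is mostly a .bazelrc entry; the endpoint below is a placeholder for whatever cache backend you run (HTTP and gRPC caches both work):

    # .bazelrc used by CI (cache endpoint is a placeholder).
    build --remote_cache=grpcs://bazel-cache.example.com
    build --remote_upload_local_results=true
    # If the cache sits behind GCP auth, something like:
    # build --google_default_credentials

Each CI run then only recompiles targets whose inputs actually changed and pulls everything else from the cache, so a fairly cheap runner is usually enough.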