dilyevsky 12 hours ago
> I’m pretty shocked that the Bazel workflow involves downloading Docker base images from external URLs. That seems very un-Bazel-like! That belongs in the monorepo for sure.

Not every dependency in Bazel requires you to "first invent the universe" locally. There are lots of examples of this: toolchains, git_repository, http_archive rules, and so on. As long as they are checksummed (as they are in this case) so that you can still produce a reproducible artifact, I don't see the problem.
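For reference, pinning an external archive by checksum is only a few lines in a WORKSPACE file. This is a rough sketch; the name, URL, and hash below are placeholders, not anything from the article:

    load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

    # Hypothetical dependency: the sha256 pins the exact bytes, so the build
    # stays reproducible no matter which mirror actually serves the archive.
    http_archive(
        name = "base_image_layer",
        urls = ["https://example.com/mirrors/base-image-1.2.3.tar.gz"],
        sha256 = "<sha256 of the archive>",  # placeholder
        strip_prefix = "base-image-1.2.3",
    )

If the bytes behind the URL ever change, the checksum mismatch fails the fetch instead of silently producing a different artifact.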
carolosf 10 hours ago
Also, it is possible to air-gap Bazel and provide the files offline, as long as they match the same checksums.
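Concretely (a sketch; the mirror path is illustrative), you copy the pre-downloaded archives onto the offline machine and point Bazel at that directory:

    # .bazelrc on the air-gapped machine; /opt/bazel-mirror is a placeholder path.
    # Bazel checks this directory first and matches archives against their
    # declared checksums, so no network access is needed when the hashes agree.
    build --distdir=/opt/bazel-mirror
    fetch --distdir=/opt/bazel-mirror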
forrestthewoods 12 hours ago
Everything belongs in version control imho. You should be able to clone the repo, yank the network cable, and build. I suppose a URL with a checksum is kinda sorta equivalent.

But the article adds a bunch of new layers and complexity to avoid “downloading Cuda for the 4th time this week”. A whole lot of problems don't exist if the binary blobs live directly in the monorepo and local blob store.

It's hard to describe the magic of a version control system that actually controls the version of all your dependencies. Webdev is notorious for old projects being hard to compile. It should be trivial to build and run a 10+ year old project.
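As a sketch of the vendored approach (the library name and paths here are made up, not from the article): check the prebuilt blob into the repo under third_party/ and give it an ordinary BUILD file, so a fresh clone builds with the network cable unplugged:

    # third_party/cudart/BUILD -- illustrative only. The prebuilt .so and its
    # headers are committed to the repo (or materialized from the local blob
    # store), so there is no download step at all.
    cc_library(
        name = "cudart",
        srcs = ["lib/libcudart.so"],
        hdrs = glob(["include/**/*.h"]),
        includes = ["include"],
        visibility = ["//visibility:public"],
    )

Other targets then depend on //third_party/cudart like any other library; the tradeoff is repo size, which is where the local blob store comes in.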
| |||||||||||||||||