| ▲ | Meneth 9 hours ago |
Some years ago I thought no one would be stupid enough to download 100+ megabytes in their build script (which runs on CI whenever you push a commit). Then I learned about Docker.
|
| ▲ | eleveriven 6 hours ago | parent | next [-] |
| It's like, once it's in a container, people assume it's magic and free |
|
| ▲ | jve 8 hours ago | parent | prev | next [-] |
| Wait, docker caches layers, you don't have to rebuild everything from scratch all the time... right? |
▲ | kevincox 6 hours ago | parent | next [-]
It does, if you're building on the same host with preserved state and haven't cleaned it. There are lots of cases where people end up with an empty Docker layer cache at every CI run, or regularly wipe it, because Docker doesn't have any sort of intelligent space management (like LRU).
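One way around the blank-slate-runner problem is to push the layer cache somewhere persistent and pull it back at the start of the next build. A minimal sketch, assuming docker buildx and a registry you can write to (the registry ref is a placeholder; on GitHub Actions the gha cache backend is another option):

    docker buildx build \
      --cache-from type=registry,ref=registry.example.com/app/buildcache \
      --cache-to type=registry,ref=registry.example.com/app/buildcache,mode=max \
      --tag registry.example.com/app:latest \
      --push .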
▲ | the8472 4 hours ago | parent | prev [-]
To get fine-grained caching you need to use cache mounts, not just cache layers. But the cache export doesn't include cache mounts, therefore the docker github action doesn't export cache mounts to the CI cache. https://github.com/moby/buildkit/issues/1512
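For context, a cache mount is declared inside the Dockerfile and lives only on the builder, never in an image layer, which is why exporting layer cache alone doesn't carry it to another runner. A minimal sketch (the package choice is just illustrative):

    # syntax=docker/dockerfile:1
    FROM debian:bookworm
    # The stock image's docker-clean hook deletes downloaded .debs after install,
    # which would defeat the cache mount, so disable it first.
    RUN rm -f /etc/apt/apt.conf.d/docker-clean
    RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
        --mount=type=cache,target=/var/lib/apt,sharing=locked \
        apt-get update && apt-get install -y --no-install-recommends build-essential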
|
|
| ▲ | shim__ 8 hours ago | parent | prev [-] |
That's why I build project-specific images in CI, to be used in CI. Running apt-get every single time takes too damn long.
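Roughly the pattern, with placeholder image names:

    # ci.Dockerfile -- project-specific CI image with the slow apt-get step baked in
    FROM ubuntu:24.04
    RUN apt-get update \
        && apt-get install -y --no-install-recommends \
            build-essential git curl ca-certificates \
        && rm -rf /var/lib/apt/lists/*
    # Rebuilt only when this file changes, e.g.:
    #   docker build -f ci.Dockerfile -t ghcr.io/your-org/your-repo/ci:latest .
    #   docker push ghcr.io/your-org/your-repo/ci:latest
    # Every CI job then runs inside that prebuilt image instead of re-installing packages.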
▲ | Havoc 7 hours ago | parent [-]
Alternatively, you can use a local cache like AptCacherNg.
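A sketch of that from the Dockerfile side, assuming apt-cacher-ng is already running on a nearby host (it listens on port 3142 by default; the hostname below is a placeholder):

    RUN echo 'Acquire::http::Proxy "http://apt-cache.internal:3142";' \
            > /etc/apt/apt.conf.d/01proxy \
        && apt-get update \
        && apt-get install -y --no-install-recommends build-essential \
        && rm -f /etc/apt/apt.conf.d/01proxy  # don't bake the proxy setting into the image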
|