nopurpose 2 days ago:
My immediate question is: if all of that was on-disk data duplication, why did it affect the download size? Can't a small download be expanded into the optimal layout on the client side?
braiamp 2 days ago | parent:
It didn't. They downloaded 43 GB instead of 152 GB, according to SteamDB: https://steamdb.info/app/553850/depots/ Now it is 20 GB => 21 GB. Steam is pretty good at deduplicating data in transit from its servers. They are not idiots who will let developers/publishers eat their downstream connection with duplicated data. https://partner.steamgames.com/doc/sdk/uploading#AppStructur...
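The mechanism can be sketched in a few lines. This is not Steam's actual protocol (Steam splits depots into chunks described by a manifest; the exact chunking and hashing details here are assumptions for illustration), but it shows why a file that is mostly duplicated blocks costs far less to download than its on-disk size:

```python
import hashlib
import os

CHUNK_SIZE = 1024 * 1024  # 1 MB chunks, an arbitrary choice for the demo


def chunk_hashes(data: bytes, chunk_size: int = CHUNK_SIZE) -> list[str]:
    """Split data into fixed-size chunks and hash each one."""
    return [hashlib.sha256(data[i:i + chunk_size]).hexdigest()
            for i in range(0, len(data), chunk_size)]


def bytes_to_transfer(data: bytes, chunk_size: int = CHUNK_SIZE) -> int:
    """Only unique chunks cross the wire; repeats become references
    the client resolves when reassembling the file locally."""
    unique = set(chunk_hashes(data, chunk_size))
    return len(unique) * chunk_size


# A 4 MB "file" whose second half duplicates the first half:
half = os.urandom(2 * 1024 * 1024)
data = half + half
print(len(data), bytes_to_transfer(data))  # 4 MB on disk, 2 MB over the wire
```

The client ends up with the full duplicated layout on disk, having downloaded each unique chunk only once.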
ender341341 2 days ago | parent:
Depending on how the data duplication is actually done (e.g. with texture atlasing, the actual bits can be very different after image compression), it can be much harder to do rote bit-level deduplication. They could potentially ship the code to generate all of those assets locally, but then they have to deal with a lot of extra rights/contracts to do so (proprietary codecs/tooling is super, super common in gamedev).

Also, largely, devs/publishers honestly just don't think about it: they've been doing it as long as optical media has been prevalent (early/mid 90s). Only in the last few years have devs actually taken a look and realized it doesn't make as much sense as it used to, especially if, as in this case, the majority of the time is spent on runtime generation anyway, or if they require a 2080 as minimum spec: what's the point of optimizing for one low-end component if most people running the game are on high-end systems? Hitman recently (4 years ago) did a similar massive file shrink and mentioned many of the same things.
ahartmetz 2 days ago | parent:
Sure it can - it would need either special pre- and post-processing, or lrzip ("long range zip") to do it automatically. lrzip should be better known; it often finds significant redundancy in huge archives like VM images.
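The reason ordinary compressors miss this redundancy is window size: DEFLATE can only back-reference the previous 32 KB, so identical blocks that sit megabytes apart look incompressible. A toy demonstration (the chunk-hash pass at the end is a stand-in for the rzip-style long-range pre-pass that lrzip actually performs):

```python
import hashlib
import os
import zlib

block = os.urandom(64 * 1024)   # 64 KB of incompressible data
gap = os.urandom(64 * 1024)     # filler pushing the repeat out of range
data = block + gap + block      # the same block repeats 128 KB later

# DEFLATE's back-reference window is 32 KB, so the second copy of
# `block` is out of reach and compresses no better than random bytes.
compressed = zlib.compress(data, 9)
print(len(data), len(compressed))  # compressed size is not smaller

# A long-range pass sees the repeat: hashing 64 KB chunks finds only
# 2 unique chunks among the 3, so the third never needs to be stored.
unique = {hashlib.sha256(data[i:i + 64 * 1024]).digest()
          for i in range(0, len(data), 64 * 1024)}
print(len(unique))
```

lrzip chains exactly this idea with a conventional compressor afterwards, which is why it does so well on VM images and similar huge archives.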