| ▲ | kshri24 a day ago |
| Game development is STILL a highly underrated field. Plenty of advancements/optimizations (in both software and hardware) can be directly traced back to game development. Hopefully, with RAM prices shooting up the way they are, we go back to keeping optimization front and center and reduce all the bloat that has accumulated industry-wide. |
|
| ▲ | hinkley 18 hours ago | parent | next [-] |
| A number of my tricks are stolen from game devs and applied to boring software. Most notably, resource budgets for each task. You can’t make a whole system fast if you’re spending 20% of your reasonable execution time on one moderately useful aspect of the overall operation. |
|
| ▲ | ksec 11 hours ago | parent | prev | next [-] |
| I think one could even say gaming as a sector single handedly move most of the personal computing platform forward since 80s and 90s. Before that it was probably Military and cooperate. From DOS era, overclocking CPU to push benchmarks, DOOM, 3D Graphics API from 3DFx Glide to Direct X. Faster HDD for faster Gaming Load times. And for 10 - 15 years it was gaming that carried CUDA forward. |
|
| ▲ | abustamam a day ago | parent | prev [-] |
| Yes please! Stop making me download 100+ GB patches! |
| |
| ▲ | ffsm8 a day ago | parent [-] | | The large file sizes are not because of bloat per se... it's a technique which supposedly helped at one point to reduce loading times, Helldivers being the most notable example of a game removing this "optimization". It's there by design, specifically as an optimization, so you can't really call it bloat in the parent's context of inefficient resource usage. | | |
| ▲ | flohofwoe 16 hours ago | parent | next [-] | | This was the reason in Helldivers; other games have different reasons, like uncompressed audio (which IIRC was the reason for the CoD install-size drama a couple of years back). The underlying reason is always the same though: the dev team not caring about asset size (or, more likely, they would like to take care of it but are drowning in higher-priority tasks). | |
| ▲ | thanksgiving 21 hours ago | parent | prev | next [-] | | We aren't talking about the initial downloads though. We are talking about updates. I am like 80% sure you should be able to send what changed without sending the whole game as if you were downloading it for the first time. | | |
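(A rough sketch of that idea, assuming the installed data can be treated as fixed-size blocks and only changed blocks are shipped; the block size and helper names are invented for illustration, and the replies below explain why real games complicate this.)

    import hashlib

    BLOCK = 1 << 20  # 1 MiB blocks, purely illustrative

    def block_hashes(data):
        return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
                for i in range(0, len(data), BLOCK)]

    def make_patch(old, new):
        """(block_index, payload) pairs; payload is None for unchanged blocks."""
        old_h = block_hashes(old)
        patch = []
        for i, h in enumerate(block_hashes(new)):
            same = i < len(old_h) and old_h[i] == h
            patch.append((i, None if same else new[i * BLOCK:(i + 1) * BLOCK]))
        return patch

    def apply_patch(old, patch):
        out = bytearray()
        for i, payload in patch:
            out += payload if payload is not None else old[i * BLOCK:(i + 1) * BLOCK]
        return bytes(out)

The catch, as the replies point out, is that packed and compressed bundles shift and rewrite far more bytes than the logical change, so the blocks rarely line up.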
| ▲ | SirAiedail 17 hours ago | parent | next [-] | | Helldivers' engine does have that capability, where bundle patches only include modified files and markers for deleted files.
However, the problem with that, and likely the reason Arrowhead doesn't use it, is the lack of a process on the target device to stitch them together. Instead, patch files just sit next to the original file.
So the trade-off for smaller downloads is a continuously increasing size on disk. | |
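(A toy sketch of what "patch files just sit next to the original" implies at read time, assuming a newest-wins lookup with a tombstone for deletions; the layout and marker are invented for illustration, not Arrowhead's actual format.)

    import os

    DELETED = b"__DELETED__"  # hypothetical tombstone written by a patch

    def read_entry(bundle_dir, name):
        path = os.path.join(bundle_dir, name)
        return open(path, "rb").read() if os.path.exists(path) else None

    def resolve(name, base_dir, patch_dirs):
        """Check patches newest-first, then fall back to the base bundle."""
        for patch_dir in reversed(patch_dirs):  # patch_dirs ordered oldest -> newest
            data = read_entry(patch_dir, name)
            if data is not None:
                return None if data == DELETED else data
        return read_entry(base_dir, name)

Because nothing ever compacts superseded entries back into the base bundle, every patch adds to the on-disk footprint, which is the trade-off described above.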
| ▲ | ffsm8 20 hours ago | parent | prev | next [-] | | From my understanding of the technique, you're wrong despite being 80% sure ;) Any changes to the code or textures need the same preprocessing done again, so a large patch is basically 1% actual changes + 99% regenerated preprocessed data for this optimization. | | |
| ▲ | laggyluke 19 hours ago | parent [-] | | How about incorporating postprocessing into the update procedure instead of preprocessing? |
| |
| ▲ | thunderfork 8 hours ago | parent | prev [-] | | Generally "small patches" and "well-compressed assets" are on either end of a trade-off spectrum. More compression means large change amplification and less delta-friendly changes. More delta-friendly asset storage means storing assets in smaller units with less compression potential. In theory, you could have the devs ship unpacked assets, then make the Steam client be responsible for packing after install, unpacking pre-patch, and then repacking game assets post-patch, but this basically gets you the worst of all worlds in terms of actual wall clock time to patch, and it'd be heavily constraining for developers. |
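(A quick way to see the change-amplification end of that spectrum, using zlib as a stand-in for a real packer; the fake asset data is made up and real engines use different formats, but the effect is the same: with solid compression, a tiny edit near the front typically rewrites most of the compressed stream after it.)

    import zlib

    # Fake, compressible "assets"; real engines obviously store real data.
    files = {f"asset_{i:02d}": (f"asset {i} data ".encode() * 400) for i in range(32)}

    solid_v1 = zlib.compress(b"".join(files.values()), 9)   # one solid blob
    files["asset_00"] = files["asset_00"][:-1] + b"\x00"    # one-byte edit
    solid_v2 = zlib.compress(b"".join(files.values()), 9)

    diff = sum(a != b for a, b in zip(solid_v1, solid_v2)) + abs(len(solid_v1) - len(solid_v2))
    print(f"solid blob: ~{diff} of {len(solid_v2)} compressed bytes changed")
    # Per-file compression keeps the delta small (only asset_00's entry changes),
    # but gives up the cross-file redundancy a solid blob can exploit.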
| |
| ▲ | SkiFire13 18 hours ago | parent | prev | next [-] | | Do you have some resource for people outside this field to understand what it's about? | | |
| ▲ | baobun 17 hours ago | parent [-] | | It goes all the way back to tapes, was still important for CDs, and is still thought relevant for HDDs. Basically you get much better read performance if you can read everything sequentially, and you want to avoid random access at all costs. So you basically "hydrate" the loading patterns for each state, storing the bytes in the order they're loaded by the game. The only point where it makes things slower is once, at download/install time. Of course the whole exercise is pointless if the game only ends up installed to an HDD because its bigger size wouldn't fit on an NVMe SSD... and with 2 TB NVMe drives still affordable, it doesn't make as much sense anymore. | | |
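(A minimal sketch of that layout trick, assuming you have already recorded the order in which each game state requests its assets; the file names and index format are invented for illustration.)

    import json, os

    def write_bundle(state, load_order, asset_dir, out_dir):
        """Write one bundle per state with assets stored back-to-back in load
        order, so loading the state is a single sequential read. Assets shared
        between states are duplicated into each bundle on purpose."""
        index, offset = {}, 0
        with open(os.path.join(out_dir, f"{state}.bundle"), "wb") as bundle:
            for name in load_order:
                data = open(os.path.join(asset_dir, name), "rb").read()
                index[name] = (offset, len(data))  # offset/length for reading it back
                bundle.write(data)
                offset += len(data)
        with open(os.path.join(out_dir, f"{state}.index.json"), "w") as f:
            json.dump(index, f)

The duplication is exactly why installs balloon, and why a small logical change can end up touching many bundles at once.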
| ▲ | SkiFire13 7 hours ago | parent | next [-] | | So this basically leads to duplicating data for each state it's needed in? If that's the case I wonder why this isn't solvable by compressing the update download data (potentially with the knowledge of the data already installed, in case the update really only reshuffles it around) | |
| ▲ | tremon 14 hours ago | parent | prev [-] | | It's also a valid consideration in the context of streaming games -- making sure that all resources for the first scene/chapter are downloaded first allows the player to begin playing while the rest of the resources are still downloading. | | |
|
| |
| ▲ | abustamam 10 hours ago | parent | prev [-] | | Interesting, today I learned! |
|
|