viraptor 2 days ago

> by putting every asset a level (for example) needs in the same file, you can pretty much guarantee you can read it all sequentially

I'd love to see it analysed. Specifically, the average number of non-sequential jumps vs the overall size of the level. I'm sure you could avoid jumps within megabytes. But if someone ever got close to filling up the disk in the past, the chances of getting contiguous gigabytes are much lower. This paper effectively says that for long files, gaps are almost guaranteed https://dfrws.org/wp-content/uploads/2021/01/2021_APAC_paper... so at that point you may be better off preallocating the individual files and eating the cost of switching between them.
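
Roughly the measurement I mean, as a sketch, assuming Linux and e2fsprogs' filefrag (extent count standing in for the number of non-sequential jumps):

    import re
    import subprocess
    import sys

    # Count extents per file with filefrag; each extra extent is roughly
    # one more place the disk head has to jump to.
    def extent_count(path):
        out = subprocess.run(["filefrag", path],
                             capture_output=True, text=True).stdout
        m = re.search(r"(\d+) extents? found", out)
        return int(m.group(1)) if m else None

    for path in sys.argv[1:]:
        n = extent_count(path)
        if n is None:
            print(f"{path}: couldn't parse filefrag output")
        else:
            print(f"{path}: {n} extents, ~{n - 1} non-sequential jumps")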

toast0 2 days ago | parent | next [-]

From that paper, table 4, large files had an average # of fragments around 100, but a median of 4 fragments. A handful of fragments for a 1 GB level file is probably a lot less seeking than reading 1 GB of data out of a 20 GB aggregated asset database.
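
Quick back-of-envelope with assumed spinning-disk numbers (10 ms per seek, 150 MB/s sequential throughput, and a guessed 5,000 scattered reads to pull 1 GB out of the 20 GB pack; none of these are measured):

    SEEK_S = 0.010            # assumed average seek + rotational latency
    THROUGHPUT_MBPS = 150.0   # assumed sequential read speed

    def read_time_s(total_mb, seeks):
        return seeks * SEEK_S + total_mb / THROUGHPUT_MBPS

    print("1 GB in 4 fragments:           %5.1f s" % read_time_s(1024, 4))
    print("1 GB in 5,000 scattered reads: %5.1f s" % read_time_s(1024, 5000))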

But it also depends on how the assets are organized: you can probably group the level-specific assets into a sequential section, and maybe shared assets could be grouped so that related assets end up sequential too.
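
A toy sketch of that grouping (names, sizes, and structure are all made up): each level's assets written back to back, shared assets grouped after them, with an offset index at the end of the pack.

    import json, struct

    # Write level-specific assets contiguously, then the shared group,
    # then a JSON offset index plus a fixed-size footer giving its length.
    def write_pack(path, levels, shared):
        index = {}
        with open(path, "wb") as pack:
            for level_assets in levels.values():      # per-level, sequential
                for name, data in level_assets:
                    index[name] = (pack.tell(), len(data))
                    pack.write(data)
            for name, data in shared:                 # shared assets, grouped
                index[name] = (pack.tell(), len(data))
                pack.write(data)
            blob = json.dumps(index).encode()
            pack.write(blob)
            pack.write(struct.pack("<Q", len(blob)))  # footer: index size

    write_pack("assets.pack",
               {"level1": [("level1/terrain", b"\0" * 64)]},
               [("shared/ui_font", b"\0" * 64)])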

dontlaugh 2 days ago | parent | prev | next [-]

Sure. I’ve seen people who do packaging for games measure various techniques against the hard disks typical of the time, maybe a decade ago. It was definitely worth it then to duplicate some assets to avoid seeks.

Nowadays? No. Even those with hard disks will have lots more RAM and thus disk cache. And you are even guaranteed SSDs on consoles. I think in general no one tries this technique anymore.
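
For context, the trade-off being measured back then was roughly this (sizes are made up): copy shared assets into every level pack that uses them so each level reads sequentially, and pay for it in disk space.

    # Hypothetical asset sizes in MB.
    levels = {
        "level1": {"shared/rock": 40, "shared/tree": 25, "level1/map": 300},
        "level2": {"shared/rock": 40, "level2/map": 280},
    }
    deduped = {name: size for assets in levels.values()
               for name, size in assets.items()}
    duplicated = sum(sum(assets.values()) for assets in levels.values())
    print(f"deduplicated on disk: {sum(deduped.values())} MB")
    print(f"duplicated per level: {duplicated} MB")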

wcoenen 2 days ago | parent | prev | next [-]

> But if someone ever got closer to filling up the disk in the past, the chances of contiguous gigabytes are much lower.

By default, Windows automatically defragments filesystems weekly if necessary. It can be configured in the "Defragment and Optimize Drives" dialog.
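
If you'd rather script it than use the dialog, something along these lines works as a sketch (defrag.exe's /A flag only analyzes and reports fragmentation, and it needs an elevated prompt):

    import subprocess

    # Ask Windows' built-in defrag tool for a fragmentation analysis of C:.
    # /A only analyzes; it does not move any data.
    subprocess.run(["defrag", "C:", "/A"], check=True)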

pixl97 2 days ago | parent [-]

Not 'full' de-fragmentation. Microsoft labs did a study and found that past 64MB slabs of contiguous file data you don't gain much, so they don't bother getting gigabytes fully defragmented.

https://web.archive.org/web/20100529025623/http://blogs.tech...

An old article on the process.

justsomehnguy 2 days ago | parent | prev | next [-]

> But if someone ever got closer to filling up the disk in the past, the chances of contiguous gigabytes are much lower

Someone installing a 150GB game surely has 150GB+ of free space, and with that much free there would be a lot of contiguous free space.

jayd16 2 days ago | parent | prev [-]

It's an optimistic optimization, so it doesn't really matter if the large blobs get broken up. The idea is that it's still better than 100k small files.
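
With made-up but plausible spinning-disk numbers, the asymmetry looks something like this:

    # Assumed overheads: ~12 ms per small file (open + metadata + seek) on a
    # spinning disk, vs ~10 ms per fragment of one big blob.
    PER_FILE_OVERHEAD_S = 0.012
    SEEK_S = 0.010

    small_files = 100_000 * PER_FILE_OVERHEAD_S   # transfer time ignored
    one_blob = 100 * SEEK_S                       # same data, 100 fragments

    print(f"100k small files: ~{small_files:.0f} s of overhead")
    print(f"one blob in 100 fragments: ~{one_blob:.0f} s of overhead")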