| ▲ | mcdeltat 11 hours ago |
| > Compressing data means you save space on the disc... If you conveniently ignore the fact that common.lin is duplicated in each map's directory and is the same for every map I tested, which kinda negates part of this. This is an interesting thing I've noticed about game dev: it seems to sometimes live in a weird space of optimisation requirements vs hackiness, where you'll have stuff like using instruction data as audio to save space, but then forget to compile in release mode or something. Really odd juxtaposition of near-genius-level optimisation with naive inefficiency. I'm assuming it's because, while there may be strict performance requirements, the devs are under the pump and there's so much going on that silly stuff ends up happening? |
|
| ▲ | bargainbin 6 hours ago | parent | next [-] |
Exactly that - once it’s shipped it’s shipped. Doesn’t matter if the code is “clean” or “maintainable” or whatever. The longer it’s not released for sale, the more debt you’re incurring paying the staff. I’ve worked with a few ex-game devs and they’re always great devs, specifically at optimising. They’re not great at the “forward maintainability” aspect, though, because they’ve largely never had to do it. |
|
| ▲ | richardfey 10 hours ago | parent | prev | next [-] |
This might be an optimisation to avoid disc seeks across wildly far-apart regions of the disc, which would introduce more latency. |
| |
| ▲ | landr0id 9 hours ago | parent | next [-] | | For this file in particular I'm unsure. common.lin is a separate file which I believe is supposed to contain data common to all levels _before_ the level is loaded. There's a single exported object that all levels of the game have, called `MyLevel`. The game attempts to load this, and that triggers a load of the level data and all its unique dependencies. The common.lin file is a snapshot of everything read before this export. AFAIK this is deterministic, so it should be the exact same across all maps, but I've not tested all levels.
When loading a level, the training level for instance contains two distinct parts. Part 1 of the map loads 0_0_2_Training.lin, and the second part loads 0_0_3_Training.lin. These parts are completely independent -- loading the second part does not require loading the first. Switching parts does a complete re-launch of the game using the Xbox's XLaunchNewImage API, so I think all prior memory should be evicted, but maybe there's some flag I'm unaware of. That is to say, I'm fairly confident they are mutually exclusive.
So basically the game launches, looks in the "Training" map folder for common.lin, opens a HANDLE, then looks for whichever section it's loading, grabs a HANDLE, then starts reading common.lin and <map_part>.lin. There are multiple parts, but only one common.lin in each map folder. So no matter what, common.lin is not going to sit in a contiguous disc region leading into every <map_part>.lin. Part 1 may be right after common.lin, but if you're loading any other part you'll have to make a seek. I don't know enough about optical media seek times to say if semi-near locality is noticeably better for the worst case than the files being on completely opposite sector ranges of the disc. | | |
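A minimal sketch of that open-two-handles-then-stream flow, purely as illustration. It uses Win32-style file APIs (the Xbox kernel exposed similar ones); the paths, buffering, and error handling are assumptions, not the game's actual code.

    // Hypothetical reconstruction of the described load sequence:
    // open common.lin, open the requested map part, stream both in.
    #include <windows.h>
    #include <cstdio>
    #include <string>
    #include <vector>

    static std::vector<char> ReadWholeFile(const std::string& path) {
        HANDLE h = CreateFileA(path.c_str(), GENERIC_READ, FILE_SHARE_READ,
                               nullptr, OPEN_EXISTING,
                               FILE_FLAG_SEQUENTIAL_SCAN, nullptr);
        if (h == INVALID_HANDLE_VALUE) return {};
        LARGE_INTEGER size{};
        GetFileSizeEx(h, &size);
        std::vector<char> buf(static_cast<size_t>(size.QuadPart));
        DWORD bytesRead = 0;
        ReadFile(h, buf.data(), static_cast<DWORD>(buf.size()), &bytesRead, nullptr);
        CloseHandle(h);
        buf.resize(bytesRead);
        return buf;
    }

    int main() {
        // common.lin streams first; loading any part other than the first
        // then implies a seek away from wherever common.lin ended.
        auto common = ReadWholeFile("D:\\Maps\\Training\\common.lin");
        auto part   = ReadWholeFile("D:\\Maps\\Training\\0_0_3_Training.lin");
        std::printf("common.lin: %zu bytes, part: %zu bytes\n",
                    common.size(), part.size());
        return 0;
    }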
| ▲ | richardfey 8 hours ago | parent | next [-] | | They were doing this kind of optical media seek-time testing/optimisation for PS1 games, like Crash Bandicoot.
You certainly have more and better context than me on this console/game; I just mentioned it in case it wasn't considered. By the way, could the nonsensical offsets be checksums instead? Nice reverse engineering work and analysis there! | | |
| ▲ | ralferoo 8 hours ago | parent [-] | | IIRC the average seek time across optical media is around 120ms, so ideally you want all reads to be linear. On one game I worked on, I spent months optimising loading, especially the boot flow, to ensure that every file the game was going to load was the very next file on the disk, or else the next file was an optionally loaded file that could be skipped (as reading and ignoring was quicker than seeking). For the few non-deterministic cases where order couldn't be predicted (e.g. music loaded from a different thread), I preloaded a bunch of assets up front so that the rest of the assets were deterministic.
One fun thing we often did around this era is eschew filenames and instead hash the name. If we were loading a file directly from C code, we'd use the preprocessor to hash the name via some complicated macros, so the final call would be compiled like LoadAsset(0x184e49da), but still retain a run-time hasher for cases where the filename was generated dynamically. This seems like a weird optimisation, but actually avoiding the directory scan and filename comparisons can save a lot of unnecessary seeking / CPU operations, especially for multi-level directories.
The "file table" then just became a list of disc offsets and lengths, with a few gaps because the hash table size was a little bigger than the number of files to avoid hash conflicts. Ironically, on one title I worked on we had the same modulo for about 2 years in development, and just before launch we needed to change it twice in a week due to conflicts! | | |
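A minimal sketch of that compile-time name-hashing idea, assuming an FNV-1a hash and a made-up LoadAsset signature; the original used C preprocessor macros, whereas this uses C++ constexpr to get the same effect.

    #include <cstdint>
    #include <cstdio>

    // FNV-1a string hash (constants are an assumption, not the engine's).
    // constexpr lets the compiler fold the hash of a literal into the
    // binary, so the string itself never ships.
    constexpr uint32_t HashName(const char* s, uint32_t h = 2166136261u) {
        return *s ? HashName(s + 1, (h ^ static_cast<uint8_t>(*s)) * 16777619u)
                  : h;
    }

    // Hypothetical loader entry point: takes only the hash.
    void LoadAsset(uint32_t id) { std::printf("load 0x%08x\n", id); }

    int main() {
        // Compiles down to LoadAsset(0x........), like the example above.
        constexpr uint32_t kTitleMusic = HashName("music/title.xma");
        LoadAsset(kTitleMusic);

        // Run-time hashing still works for dynamically built names.
        char name[64];
        std::snprintf(name, sizeof(name), "maps/level%d.lin", 3);
        LoadAsset(HashName(name));
        return 0;
    }

The "file table" described above then reduces to an array of (offset, length) pairs indexed by hash modulo table size -- that table size being the modulo that had to change when two names collided.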
| ▲ | rswail 6 hours ago | parent [-] | | This reminds me of Mel: Mel's job was to re-write
the blackjack program for the RPC-4000.
(Port? What does that mean?)
The new computer had a one-plus-one
addressing scheme,
in which each machine instruction,
in addition to the operation code
and the address of the needed operand,
had a second address that indicated where, on the revolving drum,
the next instruction was located.
https://users.cs.utah.edu/~elb/folklore/mel.html |
|
| |
| ▲ | oarsinsync 7 hours ago | parent | prev [-] | | ISO9660 has support for something that resembles hard links - i.e., a file can exist in multiple places in the directory structure but always point to the same underlying data blocks on disc. I think XISO is derived from ISO9660, so it may have the same properties? |
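For reference, a sketch of the on-disc ISO9660 directory record (ECMA-119 layout; the field names are mine, and multi-byte fields are shown split into their little- and big-endian halves as they are stored). The "hard link" effect is simply two records in different directories carrying the same extent LBA.

    #include <cstdint>

    #pragma pack(push, 1)
    struct Iso9660DirRecord {
        uint8_t  record_length;   // size of this record in bytes
        uint8_t  ext_attr_length; // extended attribute record length
        uint32_t extent_lba_le;   // first sector of the file data (LE copy)
        uint32_t extent_lba_be;   //   ...and the big-endian copy
        uint32_t data_length_le;  // file size in bytes (LE copy)
        uint32_t data_length_be;
        uint8_t  datetime[7];     // recording date and time
        uint8_t  file_flags;
        uint8_t  file_unit_size;  // interleaved mode only
        uint8_t  interleave_gap;
        uint16_t volume_seq_le;   // volume sequence number
        uint16_t volume_seq_be;
        uint8_t  name_length;     // char name[name_length] follows
    };
    #pragma pack(pop)

    // Two records -- say, one under /Maps/Training/ and one under
    // /Maps/Bridge/ -- whose extent_lba fields match would share the
    // same data sectors: one copy of common.lin, many directory entries.
    int main() { return sizeof(Iso9660DirRecord) == 33 ? 0 : 1; }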
| |
| ▲ | Cthulhu_ 4 hours ago | parent | prev [-] | | Definitely could be a factor; I know of a programmer who works at a Dutch company that mainly does ports of AAA games (he may be on here too, hello!). He once wrote a comment or forum post about how he developed an algorithm to put data on a disk in the order that it was needed, to minimize disk seeks. Spinny disks benefit greatly from reading data linearly. |
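The simplest form of that idea, as a hedged sketch (all names here are hypothetical): record the order in which assets are first requested during a profiling playthrough, then write them to the disc image in that order so a normal run reads mostly linearly.

    #include <string>
    #include <unordered_set>
    #include <vector>

    // Hypothetical profiling hook: remembers the first-use order of assets.
    struct LayoutRecorder {
        std::vector<std::string> order;        // emit to disc in this order
        std::unordered_set<std::string> seen;  // dedupe repeat requests

        void OnAssetRequested(const std::string& path) {
            if (seen.insert(path).second)      // true only on first sighting
                order.push_back(path);
        }
    };

    int main() {
        LayoutRecorder rec;
        for (const char* p : {"boot/logo.tex", "maps/level1.lin",
                              "boot/logo.tex", "music/theme.xma"})
            rec.OnAssetRequested(p);
        // rec.order: boot/logo.tex, maps/level1.lin, music/theme.xma
        return 0;
    }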
|
|
| ▲ | qingcharles 2 hours ago | parent | prev | next [-] |
As a former game dev: it was a combination of that. Also, you're often starting a project on bleeding-edge tooling that you have no experience with and that isn't properly tested or documented. And game dev was always full of fresh junior devs with tons of energy, ideas, and dreams, but who were coming from homebrew, where things like reliable, beautiful, readable code are unnecessary. So tons of things get missed. I keep hoping that the one published game I have was accidentally built with debug symbols in it so it can be easily traced. Two of us on the project were heavily into performance optimization, and I absolutely remember us going through compiler options, but things were crazy. For one major milestone build for Eidos, I was hallucinating badly when I compiled and burned the CD because I'd been working for three days straight with no sleep. |
|
| ▲ | rusk 10 hours ago | parent | prev | next [-] |
There was a running theme in Mythic Quest about the engineers sweating over the system while monetisation just bolted on a casino. The same happened with GTA5 [0]: there was a ridiculous loading glitch that was quite well documented on here a while ago. Also a monetisation bolt-on. So you have competing departments, one of which must justify itself by producing a heavily optimised system, and another which is licensed to generate revenue at any cost… [0] https://news.ycombinator.com/item?id=26296339 |
| |
| ▲ | ramses0 3 hours ago | parent | next [-] | | There was one similar issue with DOOM's framerate: I'm assuming an intern got tasked with adding the "blink the LED on the fancy mouse" code (due to a marketing partnership), and it absolutely _trashes_ the framerate! https://old.reddit.com/r/Doom/comments/bnsy4o/psa_deactivate... | | | |
| ▲ | avereveard 9 hours ago | parent | prev [-] | | There are also relative pain scales. Loading happens once per session and is less painful than frame stuttering all game, for example, so given a tight deadline one would get prioritized over the other. | | |
| ▲ | Orygin 9 hours ago | parent | next [-] | | I tried playing GTAO when it was free, and oh boy. Load for 10 minutes, arrive in the game, and see you're not with your friends. So 10 more minutes to load into their server. Then you start a mission and get 10 more minutes of loading. The server disconnected? A 10-minute load to go back, without your friend. Join your friend? You guessed it: 10 more minutes of loading.
For a billion-dollar game, it's insane that I spent more time loading than playing. Imagine how many more $$ they could have gotten if players could double their play time. | | |
| ▲ | GranPC 9 hours ago | parent | prev | next [-] | | Loading in GTA Online absolutely does not happen once per session. It happens before and after every mission and activity. I'm not sure whether it's a full load or whether it was also affected by that bug, but I can certainly tell you that around 20% of my GTAO "playtime" consisted of staring at a load screen. | |
|
|
|
| ▲ | monero-xmr 10 hours ago | parent | prev [-] |
And passion to deliver. Engineers will kill themselves for a game release for no extra money and far less salary than their abilities would command at a bigcorp. But they love it, so they do it, and hack as best they can to get their art into the world. |
| |
| ▲ | qingcharles an hour ago | parent [-] | | I bought into that scam. My dream job was video game dev. I spent my whole childhood writing video games. It's how I ended up doing it professionally for near-zero money and 80-hour weeks, and burning out completely in two years. |
|