mort96 2 days ago

The negativity towards this is wild. A company followed relatively widely accepted industry practice (lots and lots of other games also have huge sizes on disk for the exact same reason), then eventually they decided to do their own independent testing to check whether said common practice actually makes things better or not in their case, found that it didn't, so they reversed it. In addition, they wrote up some nice technical articles on the topic, helping to change the old accepted industry wisdom.

This seems great to me. Am I crazy? This feels like it should be Hacker News's bread and butter, articles about "we moved away from Kubernetes/microservices/node.js/serverless/React because we did our own investigation and found that the upsides aren't worth the downsides" tend to do really well here. How is this received so differently?

zamadatix 2 days ago | parent | next [-]

Arrowhead probably deserves more love for breaking the norm, but I think it's overshadowed by people finding out for the first time that the reason for this practice, given how common HDDs still are in gaming setups, is that companies have been blindly shaving a few seconds off HDD load time at the cost of 7x the disk space.

If it had been better known that this was the cause of game bloat, this probably would have been better received. Still, Arrowhead deserves more credit both for testing and breaking the norm and for making it a popular topic.

abtinf 2 days ago | parent | next [-]

Part of what makes this outrageous is that the install size itself is probably a significant part of the reason to install the game on an HDD.

154GB vs 23GB can trivially make the difference of whether the game can be installed on a nice NVMe drive.

Is there a name for the solution to a problem (make size big to help when installed on HDD) in fact being the cause of the problem (game installed on HDD because big) in the first place?

KronisLV a day ago | parent | next [-]

> 154GB vs 23GB can trivially make the difference of whether the game can be installed on a nice NVMe drive.

I think War Thunder did it the best:

  * Minimal client 23 GB
  * Full client 64 GB
  * Ultra HQ ground models 113 GB
  * Ultra HQ aircraft 92 GB
  * Full Ultra HQ 131 GB
For example, I will never need anything more than the full client, whereas if I want to play on a laptop, I won't really need more than the minimal client (limited textures and no interiors for planes).

The fact that this isn't commonplace in every engine and game out there is crazy. There's no reason the same approach couldn't also work for DLCs and such, and no reason it couldn't be made easy in every game engine out there (e.g. LOD level 0 goes into an HQ content bundle, the lower levels go into the main package). Same for custom packages for HDDs and the like.
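
As a rough illustration of how cheap this could be at build time (the asset fields, bundle names, and LOD cutoffs below are all made up for the example, not anything War Thunder or a particular engine actually does), a packaging step could simply route each asset into a bundle based on its LOD level:

  # Hypothetical sketch: route assets into optional content bundles by LOD level
  # so a launcher can offer "minimal", "full", and "ultra HQ" installs separately.
  from dataclasses import dataclass

  @dataclass
  class Asset:
      path: str
      lod: int        # 0 = highest detail
      size_mb: float

  # Bundle names and LOD cutoffs are invented for illustration.
  BUNDLE_RULES = {
      "core":     lambda a: a.lod >= 2,  # low-detail fallbacks every install needs
      "full":     lambda a: a.lod == 1,  # standard quality
      "ultra_hq": lambda a: a.lod == 0,  # optional high-res content
  }

  def split_into_bundles(assets):
      bundles = {name: [] for name in BUNDLE_RULES}
      for asset in assets:
          for name, rule in BUNDLE_RULES.items():
              if rule(asset):
                  bundles[name].append(asset)
                  break
      return bundles

  assets = [
      Asset("tank_turret_lod0.tex", 0, 48.0),
      Asset("tank_turret_lod1.tex", 1, 12.0),
      Asset("tank_turret_lod2.tex", 2, 3.0),
  ]
  for name, items in split_into_bundles(assets).items():
      print(f"{name}: {len(items)} assets, {sum(a.size_mb for a in items):.1f} MB")

A launcher then only has to let the player pick which bundles to download; the core bundle always ships, the high-detail ones stay optional.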

consp 2 days ago | parent | prev | next [-]

Can any games these days be reliably run on HDDs with a max of 200 MB/s throughput (at best)? Or does everyone get a coffee and some cookies when a new zone loads? Even with this reduction that will take a while.

I thought they all required SSDs now for "normal" gameplay.

kbolino 2 days ago | parent [-]

Until you get to super-high-res textures and the like, the throughput isn't nearly as important as the latency.

At 200 MB/s the way hard drives usually measure it, you're able to read up to 390,625 512-byte blocks in 1 second, or to put it another way, a block that's immediately available under the head can be read in 2.56 microseconds. On the other hand, at 7200 RPM, it takes up to 8.33 milliseconds to wait for the platter to spin around and reach a random block on the same track. Even if these were the only constraints, sequentially arranging data you know you'll need to have available at the same time cuts latency by a factor of about 3000.

It's much harder to find precise information about the speed of the head arm, but it also usually takes several milliseconds to move from the innermost track to the outermost track or vice versa. In the worst case, this would double the random seek time, since the platter has to spin around again because the head wasn't in position yet. Also, since hard drives are so large nowadays, the file system allocators actually tend to avoid fragmentation upfront, leading to generally having few fragments for large files (YMMV).

So, the latency on a hard drive can be tolerable when optimized for.
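
For anyone who wants to check the arithmetic, here's the same back-of-envelope in Python (assuming 200 MB/s sequential throughput, 512-byte blocks, and 7200 RPM); it lands on roughly the same ~3000x gap between rotational latency and a sequential block read:

  # Back-of-envelope reproduction of the numbers above.
  SEQ_THROUGHPUT = 200e6   # bytes per second, sequential
  BLOCK = 512              # bytes
  RPM = 7200

  blocks_per_second = SEQ_THROUGHPUT / BLOCK        # ~390,625
  transfer_time_us = BLOCK / SEQ_THROUGHPUT * 1e6   # ~2.56 microseconds per block
  full_rotation_ms = 60_000 / RPM                   # ~8.33 ms worst-case rotational wait

  print(f"blocks per second (sequential): {blocks_per_second:,.0f}")
  print(f"one block under the head:       {transfer_time_us:.2f} us")
  print(f"worst-case rotational latency:  {full_rotation_ms:.2f} ms")
  print(f"ratio: {full_rotation_ms * 1000 / transfer_time_us:,.0f}x")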

wtallis 2 days ago | parent [-]

> On the other hand, at 7200 RPM, it takes up to 138 microseconds to wait for the platter to spin around and reach a random block on the same track.

You did the math for 7200 rotations per second, not 7200 rotations per minute = 120 rotations per second.

In gaming terms, you get at most one or two disk reads per frame, which effectively means everything has to be carefully prefetched well in advance of being needed. Whereas on a decade-old SATA SSD you get at least dozens of random reads per frame.

kbolino 2 days ago | parent [-]

Fixed!

jayd16 2 days ago | parent | prev [-]

"Self fulfilling prophecy" perhaps?

nopurpose 2 days ago | parent | prev [-]

My immediate question is: if all of that was on-disk data duplication, why did it affect the download size? Can't a small download be expanded into the optimal layout on the client side?

braiamp 2 days ago | parent | next [-]

It didn't. They downloaded 43 GB instead of 152 GB, according to SteamDB: https://steamdb.info/app/553850/depots/ Now it is 20 GB => 21 GB. Steam is pretty good at deduplicating data in transit from their servers. They are not idiots that will let developers/publishers eat their downstream connection with duplicated data.

https://partner.steamgames.com/doc/sdk/uploading#AppStructur...
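
To illustrate the general principle (this is just the basic chunk-hashing idea, not Steam's actual depot/manifest format): split files into chunks, hash them, and transfer each unique chunk only once, so seven on-disk copies of the same asset cost roughly one copy's worth of download:

  import hashlib
  import random

  CHUNK_SIZE = 1 << 20  # 1 MiB chunks, an arbitrary choice for this example

  def unique_transfer_size(files):
      """Bytes to transfer if identical chunks are only ever sent once."""
      seen, total = set(), 0
      for data in files.values():
          for i in range(0, len(data), CHUNK_SIZE):
              chunk = data[i:i + CHUNK_SIZE]
              digest = hashlib.sha256(chunk).hexdigest()
              if digest not in seen:
                  seen.add(digest)
                  total += len(chunk)
      return total

  texture = random.Random(0).randbytes(16 * 2**20)  # ~16 MiB of fake asset data
  # The same asset duplicated into seven different level bundles on disk:
  files = {f"level_{n}.bundle": texture for n in range(7)}
  print(f"on disk:     {sum(len(d) for d in files.values()) / 2**20:.0f} MiB")
  print(f"to download: {unique_transfer_size(files) / 2**20:.0f} MiB")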

myself248 2 days ago | parent [-]

Furthermore, this raises the possibility of a "de-debloater" that HDD users could run, which would duplicate the data into its loading-optimized form, if they decided they wanted to spend the space on it. (And a "de-de-debloater" to recover the space when they're not actively playing the game...)

The whole industry could benefit from this.

nomel 2 days ago | parent [-]

> to recover the space when they're not actively playing the game

This would defeat the purpose. The goal of the duplication is to place the related data physically close together on the disk. Hard links, removing then replacing, etc. wouldn't preserve the physical placement of the data, meaning the terribly slow read head has to physically sweep around more.

I think the sane approach would be to have an HDD/SSD switch for the file lookups, with all the references pointing to the same file for SSD.
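
A minimal sketch of that switch (the manifest layout, paths, and names here are all hypothetical, not how Helldivers 2 or any particular engine actually does it): the asset index resolves a logical asset either to the per-level duplicate laid out next to that level's data (HDD mode) or to the single shared copy (SSD mode).

  from enum import Enum

  class DiskMode(Enum):
      HDD = "hdd"
      SSD = "ssd"

  # Made-up manifest: logical asset id -> shared path plus per-level duplicates.
  MANIFEST = {
      "rock_large_01": {
          "shared": "shared/rock_large_01.mesh",
          "duplicates": {
              "level_03": "levels/03/rock_large_01.mesh",
              "level_07": "levels/07/rock_large_01.mesh",
          },
      },
  }

  def resolve(asset_id, level, mode):
      entry = MANIFEST[asset_id]
      if mode is DiskMode.HDD:
          # Prefer the copy placed next to this level's data, if one exists.
          return entry["duplicates"].get(level, entry["shared"])
      return entry["shared"]

  print(resolve("rock_large_01", "level_03", DiskMode.HDD))  # levels/03/...
  print(resolve("rock_large_01", "level_03", DiskMode.SSD))  # shared/...

The installer would only materialize (and defragment into place) the duplicate files when the switch is set to HDD.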

myself248 a day ago | parent [-]

So you'd have to defrag after re-bloating, to make all the files contiguous again. That tool already exists, and the re-bloater could just call it.

nomel a day ago | parent [-]

Sure, but defrag is a very slow process, especially if you're re-bloating (since it requires shifting things to make space), and definitely not something that could happen in the background, as the player is playing. Re-bloating definitely wouldn't be good for a quick "Ok, I'm ready to play!".

myself248 2 hours ago | parent [-]

I imagine it'd be equivalent to a download task, just one that doesn't consume bandwidth.

ender341341 2 days ago | parent | prev | next [-]

Depending on how the data duplication is actually done (with texture atlasing, for example, the actual bits can be very different after image compression), it can be much harder to do rote bit-level deduplication. They could potentially ship the code to generate all of that locally, but then they have to deal with a lot of extra rights/contracts to do so (proprietary codecs/tooling is super, super common in gamedev).

It's also largely because devs/publishers honestly just don't really think about it; they've been doing it for as long as optical media has been prevalent (early/mid '90s). For the last few years devs have actually been taking a look and realizing it doesn't make as much sense as it used to, especially if, like in this case, the majority of the load time is spent on runtime generation, or if they require a 2080 as minimum spec. What's the point of optimizing for one low-end component if most people running it are on high-end systems?

Hitman recently (4 years ago) did a similar massive file shrink and mentioned many of the same things.

ahartmetz 2 days ago | parent | prev [-]

Sure it can - it would need either special pre- and postprocessing or lrzip ("long range zip") to do it automatically. lrzip should be better known, it often finds significant redundancy in huge archives like VM images.

Night_Thastus 2 days ago | parent | prev | next [-]

It would be one thing if it was a 20% increase in space usage, or if the whole game was smaller to start with, or if they had actually checked to see how much it assisted HDD users.

But over 6x the size with so little benefit for such a small segment of the players is very frustrating. Why wasn't this caught earlier? Why didn't anyone test? Why didn't anyone weigh the pros and cons?

It's kind of exemplary of HD2's technical state in general - which is a mix of poor performance and bugs. There was a period where almost every other mission became impossible to complete because it was bugged.

The negativity is frustration boiling over from years of a bad technical state for the game.

I do appreciate them making the right choice now though, of course.

teamonkey 2 days ago | parent | next [-]

It was a choice, not an oversight. They actively optimised for HDD users, because they believed that failing to do so could impact load times for both SSD and HDD users. There was no speed penalty in doing so for SSD users, just a disk usage penalty.

Helldivers II was also much smaller at launch than it is now. It was almost certainly a good choice at launch.

mort96 2 days ago | parent [-]

You make a million decisions in the beginning of every project. I'm certain they made the choice to do this "optimization" at an early point (or even incidentally copied the choice over from an earlier project) at a stage where the disk footprint was small (a game being 7GB when it could've been 1GB doesn't exactly set off alarm bells).

Then they just didn't reconsider the choice until, well, now.

teamonkey 2 days ago | parent [-]

Even at the end of development it’s a sensible choice. It’s the default strategy for catering to machines with slow disk access. The risk of some players experiencing slow load times is catastrophic at launch. In absence of solid user data, it’s a fine assumption to make.

brokenmachine a day ago | parent | next [-]

Call me a dinosaur, but I don't consider a 154 GB download before I can start playing a good first impression.

In fact, I would seriously reconsider even buying a game that big if I knew beforehand. When a 500 GB SSD is $120 Aussie bucks, that's $37 of storage.

XorNot 2 days ago | parent | prev [-]

The thing is, the first impression matters. This was John Carmack's idea for how to sell interlacing to smartphone display makers for VR: the upsell he had was that there's one very important moment when a consumer sees a new phone: they pick it up, open something, and flick it, and that scroll effect had better be a silky smooth 60 FPS or more, or there's trouble. (His argument was that making that better would be a side effect of what he really wanted.)

colechristensen 2 days ago | parent | prev [-]

>But over 6x the size with so little benefit for such a small segment of the players is very frustrating. Why wasn't this caught earlier? Why didn't anyone test? Why didn't anyone weigh the pros and cons?

Have you never worked in an organization that made software?

Damn near everything could be 10x as fast and use 1/10th the resources if someone bothered to take the time to find the optimizations. RARE is it that something is even in the same order of magnitude as its optimum implementation.

zamadatix 2 days ago | parent | next [-]

I think what makes this a bit different from the usual "time/value tradeoff" discussion is that bloating the size by 6x-7x was the result of unnecessary work in the name of optimization, not a lack of cycles to spend on optimization.

mort96 2 days ago | parent [-]

Eh probably not, it's probably handled by some automated system when making release builds of the game. Sure, implementing that initially was probably some work (or maybe it was just checking a checkbox in some tool), but there's probably not much manual work involved anymore to keep it going.

Reverting it now though, when the game is out there on a million systems, requires significant investigation to ensure they're not making things significantly worse for anyone, plus a lot of testing to make sure it doesn't outright break stuff.

zamadatix 2 days ago | parent [-]

Reverting it now was certainly a pile of work, but that's neither here nor there for the portion of the story bothering people. It's like they threw rocks through the windows years ago to make them slightly clearer to see through, and now put in a ton of work to undo that because they discovered it made no sense in reality.

It's great they did all the work to fix it after the fact, but that doesn't justify why it was worth throwing rocks through the window in the first place (which is different than not doing optimizations).

ozgrakkurt 2 days ago | parent | prev | next [-]

This is not a reason for accepting it imo

mywittyname 2 days ago | parent [-]

Optimization takes up time, and often it takes up the time of an expert.

Given that, people need to accept higher costs, longer development times, or reduced scope if they want better optimized games.

But what's worse is that just trying to optimize software is not the same as successfully optimizing it. Time and money spent on optimization might yield no results, because there might not be any more efficiency to be gained, the person doing the work lacks the technical skill, the gains are part of a tradeoff that can't be justified, or the person doing the work can't make the change (i.e., a 3rd-party library is the problem).

The lack of technical skill is a big one, IMO. I'm personally terrible at optimizing code, but I'm pretty good at building functional software in a short amount of time. We have a person on our team who is really good at it and sometimes he'll come in after me to optimize work that I've done. But he'll spend several multiples of the time I took making it work and hammering out edge cases. Sometimes the savings is worth it.

kappaking 2 days ago | parent [-]

> Given that, people need to accept higher costs, longer development times, or reduced scope if they want better optimized games.

God why can’t it just be longer development time. I’m sick of the premature fetuses of games.

Cyphusx 2 days ago | parent | next [-]

The trade off they're talking about is to arrive at the same end product.

The reason games are typically released as "fetuses" is because it reduces the financial risk. Much like any product, you want to get it to market as soon as is sensible in order to see if it's worth continuing to spend time and money on it.

mort96 2 days ago | parent [-]

And this really shouldn't surprise professionals in an industry where everything's always about development velocity and releasing Minimum Viable Products as quickly into the market as possible.

maccard 2 days ago | parent | prev | next [-]

> God why can’t it just be longer development time.

Where do you stop? What do the 5 tech designers do while the 2 engine programmers optimise every last byte of network traffic?

> I’m sick of the premature fetuses of games.

Come on, keep this sort of crap off here. Games being janky isn't new - look at old console games and they're basically duct taped together. Go back to Half-life 1 in 1998 - the Xen world is complete and utter trash. Go back farther and you have stuff that's literally unplayable [0], or things that were so bad they literally destroyed an entire industry [1], or rendered the game uncompleteable [2].

[0] https://en.wikipedia.org/wiki/Dr._Jekyll_and_Mr._Hyde_(video... [1] https://www.theguardian.com/film/2015/jan/30/a-golden-shinin... [2] https://www.reddit.com/r/gamecollecting/comments/hv63ad/comm...

colechristensen 2 days ago | parent [-]

Super Mario 64, widely recognized as one of the most iconic and influential games ever, was released with a build that didn't have compiler optimizations turned on. They proved this by decompiling it and, with the exact right compiler and tools, recompiling it with the non-optimized arguments. Recompiling with the optimizations turned on resulted in no problems and significant performance boosts.

One of the highest-rated games ever was released without the devs turning on the "make it faster" button, which would have required approximately zero effort and had zero downsides.

This kind of stuff happens because the end result A vs. B doesn't make that much of a difference.

And it's very hard to have a culture of quality that doesn't get overrun by zealots who will bankrupt you while they squeeze the last 0.001% of performance out of your product before releasing. It is very hard to have a culture of quality that does the important things and doesn't do the unimportant ones.

The people who obsess with quality go bankrupt and the people who obsess with releasing make money. So that's what we get.

A very fine ability for evaluating quality mixed with pragmatic choice for what and when to spend time on it is rare.

maccard a day ago | parent [-]

> The people who obsess with quality go bankrupt and the people who obsess with releasing make money. So that's what we get.

I think this is a little harsh and I'd rephrase the second half to "the people who obsess with releasing make games".

unusualmonkey 2 days ago | parent | prev [-]

Just wait until after launch. You get a refined experience and often much lower prices.

thaumasiotes 2 days ago | parent | prev [-]

But this isn't an optimization. The 150+GB size is the "optimization", one that never actually helped with anything. The whole news here is "Helldivers 2 stopped intentionally screwing its customers".

I don't see why it's a surprise that people react "negatively", in the sense of being mad that (a) Helldivers 2 was intentionally screwing the customers before, and (b) everyone else is still doing it.

bigstrat2003 2 days ago | parent [-]

> The whole news here is "Helldivers 2 stopped intentionally screwing its customers".

That is an extremely disingenuous way to frame the issue.

thaumasiotes 2 days ago | parent [-]

How so?

nearbuy 2 days ago | parent | prev | next [-]

This is a mischaracterization of the optimization. This isn't a standard optimization that games apply everywhere. It's an optimization for spinning disks that some games apply sometimes. They're expected to measure if the benefits are worth the cost. (To be clear, bundling assets is standard. Duplicating at this level is not.)

This doesn't advance accepted industry wisdom because:

1. The trade-off is very particular to the individual game. Their loading was CPU-bound rather than IO-bound so the optimization didn't make much difference for HDDs. This is already industry wisdom. The amount of duplication was also very high in their game.

2. This optimization was already on its way out as SSDs take over and none of the current gen consoles use HDDs.

I'm not mad at Arrowhead or trying to paint them negatively. Every game has many bugs and mishaps like this. I appreciate the write-up.

somat 2 days ago | parent | prev | next [-]

At one point (I think it was Titanfall 2), the PC port of a game deliberately converted its audio to uncompressed WAV files, inflating the install size. They said it was for performance, but the theory was that it was to make it more inconvenient for pirates to distribute.

When the details of exactly why the game was so large came out, many people felt this was a sort of customer betrayal: the publisher was burning a large part of the volume of your precious high-speed SSD for a feature that added nothing to the game.

People probably feel the same about this: why were they so disrespectful of our space and bandwidth in the first place? But I agree it is very nice that they wrote up the details in this instance.

ryandrake 2 days ago | parent | next [-]

> When the details of exactly why the game was so large came out, many people felt this was a sort of customer betrayal: the publisher was burning a large part of the volume of your precious high-speed SSD for a feature that added nothing to the game.

Software developers of all kinds (not just game publishers) have a long and rich history of treating their users' compute resources as expendable. "Oh, users can just get more memory, it's cheap!" "Oh, xxxGB is such a small hard drive these days, users can get a bigger one!" "Oh, most users have Pentiums by now, we can drop 486 support!" Over and over we've seen companies choose to throw their users under the bus so that they can cheap out on optimizing their product.

mghackerlady 2 days ago | parent [-]

Maybe that'll start to change since ram is the new gold and who knows what the AI bubble will eat next

maccard 2 days ago | parent | prev | next [-]

> They said it was for performance but the theory was to make it more inconvenient for pirates to distribute.

This doesn't even pass the sniff test. The files would just be compressed for distribution and decompressed on download. Pirated games are well known for having "custom" installers.

ycombinatrix 2 days ago | parent [-]

>The files would just be compressed for distribution and decompressed on download

All Steam downloads are automatically compressed. It's also irrelevant. The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.

duskwuff 2 days ago | parent | next [-]

> The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.

Even when Titanfall 2 was released in 2016, I don't think that was meaningfully the case. Audio compression formats have been tuned heavily for efficient playback.

maccard a day ago | parent | next [-]

It's easy to apply today's standards. Titanfall was released 11 years ago and ran on an Xbox 360 and a Core 2 Duo. MP3 was a patent-encumbered format. There's a fun DF article [0] where they say:

> Titanfall accesses Microsoft's existing cloud network, with servers spooling up on demand. When there's no demand, those same servers will service Azure's existing customers. Client-side, Titanfall presents a dedicated server experience much like any other but from the developer and publisher perspective, the financials in launching an ambitious online game change radically.

Things changed _massively_ in games between 2014 and 2017: we went from supporting borderline embedded-level platforms with enormous HW constraints and architecture differences, and running dedicated servers like it was the '90s, to basically supporting fixed-spec PCs and shipping always-online titles running on the cloud.

[0] https://www.digitalfoundry.net/articles/digitalfoundry-2014-...

nearbuy 2 days ago | parent | prev | next [-]

Uncompressed audio is typically used for sound effects, while music is compressed. Latency is the primary benefit. Uncompressed audio will play immediately while an mp3 will have a few frames delay. Sounds like gunshots or footsteps are typically short files anyway, so the increased memory usage isn't that painful.

Games also can stack many sounds, so even if the decoding cost is negligible when playing a single sound, it'll be greater if you have 32 sounds playing at once.

duskwuff 2 days ago | parent [-]

> Uncompressed audio will play immediately while an mp3 will have a few frames delay.

I'm not sure what you mean by this. Encoding latency is only relevant when you're dealing with live audio streams - there's no delay inherent to playing back a recorded sound.

> Sounds like gunshots or footsteps are typically short files anyway, so the increased memory usage isn't that painful.

Not all sound effects are short (consider e.g. loops for ambient noise!), and the aggregate file size for uncompressed audio can be substantial across an entire game.

nearbuy 2 days ago | parent [-]

> there's no delay inherent to playing back a recorded sound.

There absolutely is. You can decompress compressed audio files when loading so they play immediately, but if you want to keep your mp3 compressed, you get a delay. Games keep the sound effects in memory uncompressed.

> Not all sound effects are short

Long ambient background noises often aren't latency sensitive and can be streamed. For most games textures are the biggest usage of space and audio isn't that significant, but every game is different. I'm just telling you why we use uncompressed audio. If there is a particular game you know of that's wasting a lot of space on large audio files, you should notify the devs.

There is a reason both Unity and Unreal use uncompressed audio or ADPCM for sound effects.

justsomehnguy 2 days ago | parent [-]

> but if you want to keep your mp3 compressed, you get a delay

If that really bothers you then write your own on-disk compression format.

> why we use uncompressed audio

> ADPCM

... which is a compressed and lossy format.

nearbuy 2 days ago | parent | next [-]

> If that really bothers you then write your own on-disk compression format.

Why? What are you trying to solve here? You're going to have a hard time making a new format that serves you better than any of the existing formats.

The most common solution for instant playback is just to store the sound uncompressed in memory. It's not a problem that needs solving for most games.

ADPCM and PCM are both pretty common. ADPCM for audio is kinda like DXT compression for textures: a very simple compression that produces files many times larger than mp3, and doesn't have good sound quality, but has the advantage that playback and seek costs virtually nothing over regular PCM. The file sizes of ADPCM are closer to PCM than mp3. I should have been clearer in my first comment that the delay is only for mp3/Vorbis and not for PCM/ADPCM.

There isn't a clean distinction between compressed and uncompressed and lossy/lossless in an absolute sense. Compression is implicitly (or explicitly) against some arbitrary choice of baseline. We normally call 16-bit PCM uncompressed and lossless but if your baseline is 32-bit floats, then it's lossy and compressed from that baseline.
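
For a rough sense of the size gaps in question, here's a back-of-envelope comparison for one second of 44.1 kHz stereo audio (the 4:1 ADPCM ratio and the 128 kbps MP3 figure are typical ballpark values, not something any specific engine guarantees):

  SAMPLE_RATE = 44_100
  CHANNELS = 2

  pcm_16bit   = SAMPLE_RATE * CHANNELS * 2  # 176,400 bytes per second
  adpcm_4to1  = pcm_16bit // 4              # ~44,100 bytes per second
  mp3_128kbps = 128_000 // 8                # 16,000 bytes per second

  for name, size in [("16-bit PCM", pcm_16bit),
                     ("IMA ADPCM (~4:1)", adpcm_4to1),
                     ("MP3 @ 128 kbps", mp3_128kbps)]:
      print(f"{name:18s} {size:>8,} bytes/s")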

justsomehnguy 2 days ago | parent [-]

> Why? What are you trying to solve here? You're going to have a hard time making a new format that serves you better than any of the existing formats.

Storage space. But this is the way for the same guys who duplicate 20 GB seven times "to serve the industry standard better".

More sane people would just pack that AD/PCM in a .pk3^W sorry, in a .zip file (or any other packaging format with an LZ/7z/whatever-compatible compression method) with the fastest profile, and would have the best of both worlds: sane storage requirements, uncompressed in memory. As a bonus it would load faster from an HDD, because a data chunk which is 10 times smaller than the uncompressed one loads, surprise, 10 times faster.
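
A minimal sketch of that approach (file names and sound data here are invented; a real engine would use its own archive format rather than Python's zipfile): pack the sound bank with the fastest deflate level, inflate it once at load time, and keep the raw audio in memory afterwards.

  import zipfile

  def pack_sounds(archive_path, sounds):
      # compresslevel=1 is the fastest deflate profile; we want cheap on-disk
      # savings, not the maximum ratio.
      with zipfile.ZipFile(archive_path, "w",
                           compression=zipfile.ZIP_DEFLATED,
                           compresslevel=1) as zf:
          for name, data in sounds.items():
              zf.writestr(name, data)

  def load_sounds(archive_path):
      # One sequential read plus inflate at load time; everything stays
      # uncompressed in RAM for zero-latency playback afterwards.
      with zipfile.ZipFile(archive_path) as zf:
          return {name: zf.read(name) for name in zf.namelist()}

  fake_bank = {"gunshot.adpcm": b"\x00\x01" * 50_000,
               "footstep.adpcm": b"\x02\x03" * 20_000}
  pack_sounds("sounds.zip", fake_bank)
  print({name: len(data) for name, data in load_sounds("sounds.zip").items()})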

danbolt a day ago | parent | prev [-]

Within the scope of a game's production, the programmer time that would be spent dogfooding a new audio format can instead go towards something else that improves the value of the end product.

The uncompressed audio for latency-sensitive one-shots usually isn’t taking up the bulk of memory either.

justsomehnguy 5 hours ago | parent [-]

> programmer time spent dogfooding the new audio format can be used towards something else that improves the value of the end product

Like exploring the "widely accepted industry practices" and writing code to duplicate the assets, then writing the code to actually measure whether it did what the "industry practices" advertised, and then ripping it all out, right?

And please note that you missed the "if it really bothers you".

ycombinatrix 2 days ago | parent | prev [-]

I think GP was confused - Titanfall 1 from 2014 is the one with the massive volume of uncompressed audio. Though I think your point still stands.

I was trying to point out that the decision to compress or not compress audio likely has nothing to do with the download size.

maccard a day ago | parent | prev | next [-]

> All Steam downloads are automatically compressed.

Titanfall wasn't on steam when it launched.

> It's also irrelevant. The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.

The person that I replied to (not you) claimed "They said it was for performance but the theory was to make it more inconvenient for pirates to distribute."

justsomehnguy 2 days ago | parent | prev [-]

> The point is that playback of uncompressed audio

Bullshit. This hasn't been a problem since 2003.

And nothing forbids you from decompressing your compressed audio when you load the assets from disk.

recursive 2 days ago | parent | prev | next [-]

I remember seeing warez game releases in the late 90s whose custom packaging re-compressed sound effects that were stored uncompressed in the original installer, then expanded them again on install.

It seems no one takes pride in their piracy anymore.

ycombinatrix 2 days ago | parent | prev | next [-]

Wasn't that Titanfall 1? I remember Titanfall 2 having a much smaller installation size.

snet0 2 days ago | parent | prev [-]

This is conspiratorial nonsense.

scsh 2 days ago | parent | prev | next [-]

It's because shitting on game devs is the trendy thing these days, even among more technically inclined crowds unfortunately. It seems like there's a general unwillingness to accept that game development is hard and you can't just wave the magic "optimize" wand at everything when your large project is also a world of edge cases. But it seems like it should be that simple according to all the armchair game devs on the internet.

buildbot 2 days ago | parent | next [-]

The level of work that goes into even “small” games is pretty incredible. When I was a grad student another student was doing their (thesis based, research focused) masters while working at EA on a streetfighter(?) game.

The game programming was actually just as research focused and involved as the actual research. They were trying to figure out how to get the lowest latency and consistency for impact sounds.

red-iron-pine 2 days ago | parent | prev | next [-]

The engineer's disease: "I'm smarter than you and I need to prove it, and we're so smart we wouldn't have shipped this code in the first place", bla bla bla.

also keep in mind that modern gaming generates more revenue than the movie industry, so it's in the interests of several different parties to denigrate or undermine any competing achievement -- "Bots Rule Every Thing Around Me"

jeffwask 2 days ago | parent | prev | next [-]

For me it's not so much about shitting on game devs as it is about shitting on the ogres that run game companies. Any of us who have done development should understand we have little control over scope and often want to do more than the business allows us to.

scsh 2 days ago | parent [-]

That is completely ok in my opinion. It's just most discourse I come across treats the developers as complete amateurs who don't know what they're doing. As someone who's a professional dev myself I just can't get behind bashing the people doing the actual work when I know we're all dealing with the same business realities, regardless of industry.

embedding-shape 2 days ago | parent | prev | next [-]

Meh, the same is true for almost every discussion on the internet: everyone is an armchair expert on whatever subject you come across, and when you ask them about their experience it boils down to "I read lots of Wikipedia articles".

I mean, I agree with you that it is trendy, and seemingly easy, to shit on other people's work, and at this point it seems to be a challenge people take up upon themselves to criticise something in the most flowery and graphic way possible, hoping to score those sweet internet points.

For maybe the last 6-7 years I've stopped reading reviews and opinions about newly launched games completely; the internet audience (and reviewers) are just so far off base compared to my own perspective and experience that it has become less than useless, just noise at this point.

AngryData 2 days ago | parent [-]

I wish many people's "expertise" at least amounted to reading Wikipedia. It seems for many that is too much, and they either make crap up on the spot or latch onto the first thing they find that confirms their biases, regardless of how true it is.

taeric 2 days ago | parent | prev [-]

There has long been a trend that "software engineers" and "computer scientists" both have been rather uninterested in learning the strategies that gaming developers use.

Really, the different factions in software development are a fascinating topic to explore. Add embedded to the discussion, and you could probably start fights in ways that flat out don't make sense.

reactordev 2 days ago | parent | prev | next [-]

The negativity comes from the zero effort they put into this prior to launch. Forcing people to download gigs of data that was unnecessary.

Game studios no longer care how big their games are if Steam will still take them. This is a huge problem. GTA5 was notorious for loading JSON again, and again, and again during loading; it was just a mess. Same for HD2. Game engines have the ability to pack only what is used, but it's still up to the developers to make sure their assets are reusable so as to cut down on size.

This is why Star Citizen has been in development for 15 years. They couldn't optimize early and were building models and assets like they're for film: not low-poly game assets but super-high-poly film assets.

The anger here is real. The anger here is justified. I'm sick of having to download 100gb+ simply because a studio is too lazy and just packed up everything they made into a bundle.

bluedino 2 days ago | parent | next [-]

> They couldn't optimize early and were building models and assets like it's for film. Not low poly game assets but super high poly film assets.

Reminds me of the Crack.com interview with Jonathan Clark:

Adding to the difficulty of the task, our artist had no experience in the field. I remember in a particular level we wanted to have a dungeon. A certain artist begin by creating a single brick, then duplicating it several thousand times and building a wall out of the bricks. He kept complaining that his machine was too slow when he tried to render it. Needless to say this is not the best way to model a brick wall.

https://web.archive.org/web/20160125143707/http://www.loonyg...

reactordev 2 days ago | parent [-]

This is very, very common, as there's only a handful of schools that teach this. Displacement mapping with a single poly is the answer. Game-dev-focused schools teach this, but at any other visual media school it's "build a brick, array the brick 10,000 times".

fyrabanks 2 days ago | parent | prev [-]

There were 20 people working on this game when they started development. Total. I think they expanded to a little over 100. This isn't some huge game studio that has time to do optimization.

GTA5 had well over 1000 people on its team.

onli 2 days ago | parent | next [-]

Not sure GTA 5 is the right example to list here. Remember https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times.... At least for a while they didn't optimize at all.

reactordev 2 days ago | parent | prev [-]

Size of team has no bearing on this argument. Saying they were small so they get a pass on obscene download sizes is like saying "Napster was created by one man, surely he shouldn't be accountable", but he was.

When making a game, once you have something playable, the next step is figuring out how to package it. This is included in that effort: determining which assets to compress, package, and ship. Sometimes this is done by the engine. Sometimes this is done by the art director.

WheatMillington 2 days ago | parent [-]

Amount of resources absolutely has a bearing on how resources can be allocated.

reactordev 2 days ago | parent [-]

This isn’t a resourcing issue. It’s a lack of knowledge and skipped a step issue.

When I did this. My small team took a whole sprint to make sure that assets were packed. That tilemaps were made. That audio files were present and we did an audit to make sure nothing extra was packaged on disk. Today, because of digital stores and just releasing zip files, no one cares what they ship and often you can see it if you investigate the files of any Unity or Unreal engine game. Just throw it all over the fence.

Krasnol 2 days ago | parent | prev | next [-]

I feel like negativity has become Hacker News's bread and butter.

2 days ago | parent | prev | next [-]
[deleted]
2 days ago | parent | prev | next [-]
[deleted]
vict7 2 days ago | parent | prev | next [-]

Many players perceive Arrowhead as a pretty incompetent and untrustworthy developer. Helldivers has suffered numerous issues with both performance and balancing. The bugs constantly introduced into the game (not the fun kind you get to shoot with a gun) have eroded a lot of trust and good will towards the company and point towards a largely non-existent QA process.

I won’t state my own personal views here, but for those that share the above perspective, there is little benefit of the doubt they’ll extend towards Arrowhead.

MattGaiser 2 days ago | parent | prev | next [-]

Probably because many are purists. It is like how anything about improving Electron devolves into "you shouldn't use Electron."

Many would consider this a bare minimum rather than something worthy of praise.

mschuster91 2 days ago | parent [-]

> Probably because many are purists. It is like how anything about improving Electron devolves into "you shouldn't use Electron."

The Electron debate isn't about details purism, the Electron debate is about the foundation being a pile of steaming dung.

Electron is fine for prototyping, don't get me wrong. It's an easy and fast way to ship an application, cross-platform, with minimal effort and use (almost) all features a native app can, without things like CORS, permission popups, browser extensions or god knows what else getting in your way.

But it should always stay a prototype and eventually be shifted to a native application. Unlike Internet Explorer in its heyday, which you could trivially embed as ActiveX without resource gobbling, if you now have ten apps consuming 1 GB of RAM each just for the Electron base to run, the user runs out of memory, because, like PHP, nothing is shared.

zamadatix 2 days ago | parent | next [-]

Each person seems to have their own bugbear about Electron, but I really doubt improving Electron to have shared instances a la WebView2 would make much of a dent in the hate for it here.

saratogacx 2 days ago | parent | prev | next [-]

Removing layers is hard though; better to have Electron host a WASM application, which will become a new "native" that gets argued about semantically.

jauntywundrkind 2 days ago | parent | prev [-]

Or these devs & users can migrate to a PWA, which will have vastly less overhead because the browser is shared, and because each of those 10 apps you mention would be (or could be, if they have an OK data architecture) tiny.

mschuster91 2 days ago | parent [-]

> Or these devs & users can migrate to a PWA

PWAs have the problem that for every interaction with the "real world" they need browser approval. While that is for a good reason, it also messes with the expectations of the user, and some stuff such as unrestricted access to the file system isn't available to web apps at all.

eurekin 2 days ago | parent | prev [-]

The negativity wasn't created in a vacuum. Arrowhead has a long track record of technical mishaps and a proven history of erasing all evidence of those issues without ever trying to acknowledge them. The subreddits, Discord, and YouTube comment sections are heavily moderated. I suspect there might be a 3rd party involved in this which doesn't forward any technical issues if the complaint shows any sign of frustration. Even the relationship with their so-called "Propaganda Commanders" (the official moniker for their YouTube partner channels) has been significantly strained in two cases, over trivialities.

It took Sony's intervention to actually pull the game back into a playable state once, resulting in the so-called 60-day patch.

Somehow random modders were able to fix some of the most egregiously ignored issues (like an enemy type making no sound) quickly and effectively. Arrowhead ignored it, then denied it, then used the "gamers bad" tactic and banned people for pointing it out. After a long time they finally fixed it, and tried to bury it in the patch notes too.

They have also been caught straight-up lying about changes; the most recent one was "Apparently we didn't touch the Coyote", where they simply buffed enemies' resistance to fire, effectively nerfing the gun.

sigmoid10 2 days ago | parent [-]

Sony nearly killed all good will the game had accrued when they tried to use the massive player base as an opportunity to force people into their worthless ecosystem. I don't think Sony even has the capability to make good technical decisions here, they are just the publisher. It was always Arrowhead trying to keep up with their massive success that they clearly weren't prepared for at all. In the beginning they simply listened to some very vocal players' complaints, which turned out to not be what the majority actually wanted. Player driven development is hardly ever good for a game.

eurekin 2 days ago | parent [-]

So, players wanting:

- To their PC not reboot and BSOD (was a case few months ago)

- Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)

- Be able to use weapon customisation (the game crashed, when you navigated to the page with custom paints)

- Continue to run, even when anybody else from the team was stimming (yes, any person in the team stimming caused others to slow down)

- Actually be able to hear one of the biggest enemies in the game

- To not issue stim/reload/weapon change multiple times, for them just to work (it's still normal to press stim 6 times in some cases, before it activates, without any real reason)

- Be able to use chat, when in the vehicle (this would result in using your primary weapon)

- Be able to finish drill type mission (this bugs out a lot still)

- Not be attacked by enemies that phase through buildings

- Not be attacked by bullets passing through terrain, despite the player bullets being stopped there

are just vocal players' complaints? A lot of those bugs went totally unaddressed for months. Some keep coming back in regressions. Some are still ongoing. This is only a short list of things I came across while casually playing. It's a rare sight to have a full OP without an issue (even mission hardlocks still happen).

About Sony: I was specifically referring to Shams Jorjani's (CEO of Arrowhead) explanation to Hermen Hulst (the head of PlayStation Studios) of why the review score collapsed to 19%, among other issues.

FieryMechanic 2 days ago | parent | next [-]

As someone with 700 hours in the game, I've played the game both on Windows and Linux.

A lot of the issues have to do with the fact that the game seems to corrupt itself. If I have issues (usually performance related), I do a Steam integrity check and I have zero issues afterwards. I've had to do this on several games now, so this isn't something unique to Helldivers. My hardware is good, BTW; I check in various utils and the drives are "OK" as far as I can tell.

> - To their PC not reboot and BSOD (was a case few months ago)

This was hyped up by a few big YouTubers. The BSODs were because their PCs were broken. One literally had a burn mark on their processor (a known issue with some board/processor combos), and the BSODs went away when they replaced the processor. This tells me that there was something wrong with their PC and any game would have caused a BSOD.

So I am extremely sceptical of any claims of BSODs because of a game. What almost is always the case is that the OS or the hardware is at issue and playing a game will trigger the issue.

If you are experiencing BSODs, I would make sure your hardware and OS are actually good, because they are probably not. BTW, I haven't had a BSOD in Windows for about a decade, because I don't buy crap hardware.

> - Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)

False. A few months ago I played it for an entire day and the game was fine. Last week I played it a good portion of Saturday night. I'm in several large HellDivers focused Discord servers and I've not heard a lot of people complaining about it. Maybe 6 months ago or a year ago this was the case, but not now.

> Be able to use weapon customisation (the game crashed, when you navigated to the page with custom paints)

This happened for like about a week for some people and I personally didn't experience this.

> To not issue stim/reload/weapon change multiple times, for them just to work (it's still normal to press stim 6 times in some cases, before it activates, without any real reason)

I've not experienced this. I haven't heard anyone complain about it either, and I'm in like 4 different Helldivers-focused Discord servers.

> Not be attacked by enemies that faze through buildings

This can be annoying, but it happens like once in a while. It isn't the end of the world.

gfaster 2 days ago | parent | next [-]

> So I am extremely sceptical of any claims of BSODs because of a game.

Generally speaking, I am too. That is unless there is kernel-level anticheat. In that case I believe it's fair to disregard all other epistemological processes and blame BSODs on the game out of principle

eurekin 2 days ago | parent | next [-]

I had them and I keep observing this strange tendency to wipe that particular issue out of existence

FieryMechanic 2 days ago | parent | prev [-]

> In that case I believe it's fair to disregard all other epistemological processes and blame BSODs on the game out of principle

I am sorry but that is asinine and unscientific. You should blame BSODs on what is causing them. I don't like kernel anti-cheat but I will blame the actual cause of the issues, not assign blame on things which I don't approve of.

I am a long-time Linux user, and many of the people complaining about BSODs on Windows had broken the OS in one way or another. Some were running weird stuff like 3rd-party shell extensions that modify core DLLs, or they had installed every piece of POS shovelware/shareware crap. It isn't Microsoft's fault if you start running an unsupported configuration of the OS.

Similarly, the YouTubers that were most vocal about Helldivers problems did basically no proper investigation other than saying "look, it crashed", when it was quite clearly their broken hardware that was the issue. As previously stated, their CPU had a burn mark on one of the pins; some AM5 boards had faults that caused this, IIRC. So everything indicated hardware failure as the cause of the BSOD. They still blamed the game, probably because it got them more watch time.

During the same time period when people were complaining about BSODs, I didn't experience one. I was running the same build of the game as them, playing on the same difficulty, and sometimes recording it via OBS (just like they were). What I didn't have was an AM5 motherboard; I have an older AM4 motherboard, which doesn't have these problems.

gfaster 2 days ago | parent [-]

> that is asinine and unscientific

Well, yes. I did say something to that effect. Blaming BSODs on invasive anti-cheat out of principle is a political position, not a scientific one.

> During the same time period when people were complaining about BSODs, I didn't experience one. I was running the same build of the game as them and playing on the same difficulty and sometimes recording it via OBS (just like they were). What I didn't have was a AM5 motherboard, I have and older AM4 motherboard which doesn't have these problems.

I understand what you're saying here, but anyone who does a substantial amount of systems programming could tell you that hardware-dependent behavior is evidence for a hardware problem, but does not necessarily rule out a software bug that only manifests on certain hardware. For example, newer hardware could expose a data race because one path is much faster. Alternatively, a subroutine implemented with new instructions could be incorrect.

Regardless, I don't doubt that this issue with Helldivers 2 was caused by (or at least surfaced by) certain hardware, but that does not change that given such an issue, I would presume the culprit is kernel anticheat until presented strong evidence to the contrary.

FieryMechanic 2 days ago | parent [-]

> Well, yes. I did say something to that effect. Blaming BSODs on invasive anti-cheat out of principle is a political position, not a scientific one.

When there are actual valid concerns about the anti-cheat, these will be ignored because of people that assigned blame to it when unwarranted. This is why making statements based on your ideology can be problematic.

> I understand what you're saying here, but anyone who does a substantial amount of systems programming could tell you that hardware-dependent behavior is evidence for a hardware problem, but does not necessarily rule out a software bug that only manifests on certain hardware. For example, newer hardware could expose a data race because one path is much faster. Alternatively, a subroutine implemented with new instructions could be incorrect.

People were claiming it was causing hardware damage, which is extremely unlikely since Intel, AMD, and most hardware manufacturers have mechanisms that prevent this. This isn't some sort of opaque race condition.

> I would presume the culprit is kernel anticheat until presented strong evidence to the contrary.

You should know that making assumptions without evidence will often lead you astray.

I don't like kernel anti-cheat and would prefer for it not to exist, but making stupid statements based on ideology instead of evidence just makes you look silly.

eurekin 2 days ago | parent | prev | next [-]

> - To their PC not reboot and BSOD (was a case few months ago)

I was just about to replace my GPU (a 4090 at that!); I had them 3 times a session. I sank a lot of hours into debugging it (replaced cables, switched PSUs between desktops) and just gave up. After a few weeks, lo and behold, a patch came out and it all disappeared.

A lot of people just repeat hearsay about the game

eurekin 2 days ago | parent | prev | next [-]

> > - Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)

> False. A few months ago I played it for an entire day and the game was fine. Last week I played it a good portion of Saturday night. I'm in several large HellDivers focused Discord servers and I've not heard a lot of people complaining about it. Maybe 6 months ago or a year ago this was the case, but not now.

I specifically mean that exact moment, right after the Pelican starts to fly. I keep seeing "<player> left" or "disconnected". Some come back, and I have a habit of asking "Crash?"; they respond with "yeah".

FieryMechanic 2 days ago | parent [-]

If that is happening, they need to do a Steam Integrity check. I understand the game is buggy, but it isn't that buggy.

XorNot 2 days ago | parent | prev [-]

It's basically an Internet fable at this point that there's "a game that physically damages your hardware".

The answer to every such claim is just: no. But it's clickbait gold for the brain-damaged outrage-YouTuber brigade.

Accidentally using a ton of resources might reveal weaknesses, but it is absolutely not any software vendor's problem that 100% load might reveal that your thermal paste application sucked or that Nvidia is skimping on cable load balancing.

eurekin 2 days ago | parent | next [-]

Trust me, I'm a software developer with more than two decades of experience. I've been dabbling in hardware since the Amiga 500 era. "I have that specific set of skills" that allows me to narrow down a class of issues pretty well: just a lot of component switching, in a binary divide-and-conquer fashion, across hardware.

The issue is 1) actually exaggerated in the community, but not without actual substance, and 2) getting disregarded exactly because of the exaggerations. It was a very real thing.

I also happen to have a multi-GPU workstation that works flawlessly too.

FieryMechanic 2 days ago | parent | prev [-]

This was pretty much my take as well. I have an older CPU, motherboard, and GPU combo from before the newer GPU power cables that obviously weren't tested properly, and I have no problems with stability.

These guys are running an intensive game on the highest difficulty while streaming, and they probably have a bunch of browser windows and other software running in the background. Any weakness in the system is going to be revealed.

I had performance issues during that time and had to restart the game every 5 matches. But it takes like a minute to restart the game.

sigmoid10 a day ago | parent | prev [-]

>I specifically referred the Shams Jorjani's (CEO of ArrowHead) explanation to Hermen Hulst (the head of PlayStation Studios) why the review score collapsed to 19%, among other issues.

I don't know what you mean. The game literally got 84,000 negative reviews within 24 hours after Sony tried to force PSN on everyone. No bug or missing feature ever came anywhere close to this kind of negative sentiment toward the game.