reactordev 2 days ago

The negativity comes from the zero effort they put into this prior to launch, forcing people to download gigabytes of unnecessary data.

Game studios no longer care how big their games are as long as Steam will still take them. This is a huge problem. GTA5 was notorious for parsing the same JSON again, and again, and again during loading, and it was just a mess. Same for HD2. Game engines have the ability to pack only what is used, but it's still up to the developers to make sure their assets are reusable so as to cut down on size.

This is why Star Citizen has been in development for 15 years. They couldn't optimize early and were building models and assets as if for film: not low-poly game assets, but super-high-poly film assets.

The anger here is real. The anger here is justified. I'm sick of having to download 100GB+ simply because a studio is too lazy and just packed everything they made into one bundle.

bluedino 2 days ago | parent | next [-]

> They couldn't optimize early and were building models and assets like it's for film. Not low poly game assets but super high poly film assets.

Reminds me of the Crack.com interview with Jonathan Clark:

Adding to the difficulty of the task, our artist had no experience in the field. I remember in a particular level we wanted to have a dungeon. A certain artist began by creating a single brick, then duplicating it several thousand times and building a wall out of the bricks. He kept complaining that his machine was too slow when he tried to render it. Needless to say this is not the best way to model a brick wall.

https://web.archive.org/web/20160125143707/http://www.loonyg...

reactordev 2 days ago | parent [-]

This is very, very common, as only a handful of schools teach this. Displacement mapping on a single poly is the answer. Game-dev-focused schools cover it, but at any other visual media school it's "build a brick, array the brick 10,000 times".
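For the curious, here's a rough CPU-side sketch (in C, purely illustrative) of what "one poly plus a displacement map" means: the authored asset is a single quad, the wall detail lives in a heightmap, and the renderer subdivides and pushes vertices out at draw time. Normally this happens in a tessellation or vertex shader sampling a real brick heightmap texture; brick_height() below is a toy stand-in.

    /* Toy displacement mapping: one authored quad, detail from a heightmap.
       A real engine does this on the GPU by sampling a texture; everything
       here (grid size, brick_height) is an illustrative assumption. */
    #include <math.h>
    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;

    /* Fake heightmap: raised bricks separated by recessed mortar lines. */
    static float brick_height(float u, float v) {
        float row = floorf(v * 8.0f);
        float uu  = u * 4.0f + ((int)row % 2) * 0.5f; /* stagger alternate rows */
        float fu  = uu - floorf(uu);
        float fv  = v * 8.0f - row;
        return (fu > 0.05f && fv > 0.1f) ? 0.05f : 0.0f; /* brick vs mortar */
    }

    int main(void) {
        enum { N = 64 };                     /* runtime tessellation density */
        static Vec3 verts[(N + 1) * (N + 1)];
        for (int j = 0; j <= N; j++)
            for (int i = 0; i <= N; i++) {
                float u = (float)i / N, v = (float)j / N;
                verts[j * (N + 1) + i] = (Vec3){ u, v, brick_height(u, v) };
            }
        printf("one quad -> %d displaced vertices, zero brick meshes\n",
               (N + 1) * (N + 1));
        return 0;
    }

The whole wall ships as one mesh plus one texture, instead of thousands of duplicated brick objects the renderer has to chew through.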

fyrabanks 2 days ago | parent | prev [-]

There were 20 people working on this game when they started development. Total. I think they expanded to a little over 100. This isn't some huge game studio that has time to do optimization.

GTA5 had well over 1000 people on its team.

onli 2 days ago | parent | next [-]

Not sure GTA 5 is the right example to list here. Remember https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times.... At least for a while they didn't optimize at all.
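To spell out what that article found (paraphrasing; the sketch below is illustrative C, not Rockstar's code): the loader parsed a roughly 10MB JSON blob with sscanf, and common libc implementations run strlen over the whole remaining buffer on every sscanf call, so a single linear parse silently becomes quadratic.

    /* The quadratic-parse pitfall described in the article, in miniature.
       Slow loop: glibc's sscanf strlen()s its input each call -> O(N * size).
       Fast loop: strtol advances an end pointer -> one pass total. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        enum { N = 100000 };
        char *buf = malloc(N * 8 + 1);       /* stand-in for the big JSON */
        if (!buf) return 1;
        size_t len = 0;
        for (int i = 0; i < N; i++)
            len += (size_t)sprintf(buf + len, "%d,", i);

        long slow = 0;
        int val, used;
        for (const char *p = buf; sscanf(p, "%d,%n", &val, &used) == 1; p += used)
            slow += val;                     /* re-scans the buffer every time */

        long fast = 0;
        char *end;
        for (const char *p = buf; ; p = (*end == ',') ? end + 1 : end) {
            long v = strtol(p, &end, 10);
            if (end == p) break;
            fast += v;
        }
        printf("%ld %ld\n", slow, fast);     /* same sums, wildly different time */
        free(buf);
        return 0;
    }

(If I remember the article right, the author's actual fix was caching the strlen result and skipping a redundant duplicate check, but the lesson is the same: don't rescan a 10MB buffer per token.)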

reactordev 2 days ago | parent | prev [-]

Size of team has no bearing on this argument. Saying they were small, so they get a pass on obscene download sizes, is like saying “Napster was created by one man, surely he shouldn't be held accountable.” But he was.

When making a game, once you have something playable, the next step is to figure out how to package it. Determining which assets to compress, package, and ship is part of that effort. Sometimes this is done by the engine; sometimes it's done by the art director.

WheatMillington 2 days ago | parent [-]

The amount of resources absolutely has a bearing on how those resources can be allocated.

reactordev 2 days ago | parent [-]

This isn’t a resourcing issue. It’s a lack-of-knowledge, skipped-a-step issue.

When I did this. My small team took a whole sprint to make sure that assets were packed. That tilemaps were made. That audio files were present and we did an audit to make sure nothing extra was packaged on disk. Today, because of digital stores and just releasing zip files, no one cares what they ship and often you can see it if you investigate the files of any Unity or Unreal engine game. Just throw it all over the fence.