iamcalledrob 14 hours ago

Sloppy technical design ends up manifesting in bugs, experiential jank, and instability.

There are some types of software (e.g. websites especially) where a bit of jank is generally acceptable. Sessions are relatively short, and your users can reload the webpage if things stop working. The technical rigor of these codebases tends to be poor, but it's generally fine.

Then there's software which is very sensitive to issues (e.g. a multi-player game server, a driver, or anything that's highly concurrent). The technical rigor here needs to be very high, because a single mistake can be devastating. This type of software attracts people who want to take pride in their code, because the quality really does matter.

I think these people are feeling threatened by LLMs. Not so much because an LLM is going to outperform them, but because an LLM will (currently) make poor technical design decisions that will eventually add up to the ruin of high-rigor software.

Benjammer 14 hours ago | parent [-]

> the quality really does matter.

If this level of quality/rigor does matter for something like a game, do you think the market will enforce this? If low rigor leads to a poor product, won't it sell less than a good product in this market? Shouldn't the market just naturally weed out the AI slop over time, assuming it's true that "quality really does matter"?

Or were you thinking about "matter" in some other sense than business/product success?

iamcalledrob 13 hours ago | parent | next [-]

Yes, I think the market will enforce this. A bit. Eventually. But the time horizon is long, and crummy software with a strong business moat can out-compete great software.

Look at Windows. It's objectively not been a good product for a long time. Its usage is almost entirely down to its moat.

tabwidth 9 hours ago | parent | prev | next [-]

How long does that take though? Technical debt from sloppy code doesn't show up in the product until way later. By the time users notice, the team is already three features deep and can't back out.

tokioyoyo 8 hours ago | parent [-]

All these arguments somehow disregard that we’ve all been adding technical debt left and right, every other day, to every single codebase in existence. Humans also write sloppy code.

FridgeSeal 12 hours ago | parent | prev | next [-]

A lot of software is forced upon people against their will, and purchased by people who will never use it.

This obscures things in favour of the “quality/performance doesn’t matter argument”.

I am, for example, forced to use a variety of microslop and zoom products. They are unequivocally garbage. Given the option, I would not use them. However, my employer has saddled us with them for reasons, and we must now deal with it.

bloppe 14 hours ago | parent | prev | next [-]

Yes, both the article and GP are making that exact point about it mattering from a customer's perspective.

SpicyLemonZest 13 hours ago | parent | prev | next [-]

Even if you're confident you can stop your own company from shipping terrible products, I worry the trend is broad enough and hard enough to audit that the market will enforce it by pulling back on all purchases of such software. If gamers learn that new multiplayer games are just always laggy these days, or CTOs learn that new databases are always less reliable, it's not so easy to convince them that your product is different than the rest.

theossuary 14 hours ago | parent | prev [-]

Yes, there's every reason to believe the market will weed out the AI slop. The problem is, just like with stocks, the market can stay irrational longer than you can stay solvent. While we all wait for executives to learn that code rigor matters, we still have bills to pay. A year from now, when they start trying to hire people to clean up their mess, we'll be the ones having to shovel a whole new level of shit; and the choice will be between that and starving.

As someone who also falls into camp one, and absolutely loves that we have thinking computers now, I can also recognize that we're angling towards a world of hurt over the next few years while a bunch of people in power have to learn hard lessons we'll all suffer for.