snailmailman a day ago

This case is wild and seems to perfectly encapsulate all the problems people complain about with vibecoded projects.

The "rewrite it in rust" commit is +1M lines of code. Humans haven't looked at that in depth. In about a week, they saw the tests passed and pushed it to main. Now people have started to look through it and are pointing out glaring issues. And the solution is just going to be "feed it to another AI and ask it to fix it".

The entire codebase is slop now. Nobody knows what it does. It manages to pass some tests, but it's largely a black box simply because humans haven't read it yet. The code isn't guaranteed to be anything close to 1:1 with the old codebase. It's probably vaguely shaped like the old codebase, but new bugs could be there, old bugs could be there, nobody knows anything yet.

It's going to be interesting to see how recoverable this is. They are almost certainly going to just hand every file to an AI, say "look for soundness issues and fix them," and then what? If AI is making huge, sweeping changes to the code so frequently that humans can't keep up, is that really maintainable? The only solution appears to be "even more AI," while anybody who looks closely gets scared away by the too-large-to-comprehend-and-entirely-slop codebase.

This kind of thing has been happening with many smaller projects already, but now it's a larger project and happening in a much more public way, with the intent to replace human-written, mostly-understood code with slop. I suspect the same thing, with the same problems, is happening inside all the largest companies, just not quite as obviously.

lioeters a day ago | parent | next

> only solution appears to be "even more AI"

That's the idea: to transform businesses into being wholly dependent on "AI" services to develop software. What better way than to re/write entire codebases until no human being understands them?

The Zig project knows this, and its so-called "anti-AI" policy is actually pro-community, cultivating human understanding. It's not about the tool or technology, per se; it's about people, knowledge, and sustainability.

In contrast, the Bun project is demonstrating that it doesn't care about any of that, YOLO-ing its way into losing the trust of its users, contributors, and maintainers. Oh well, AI will maintain the project now, since no one else can.

soraminazuki 20 hours ago | parent

The one thing I can't stand about the AI zealots is their anti-intellectualism. Even before coding agents became a thing, there were so many comments here along the lines of, "doing things properly has a learning cost! I don't have time for that nonsense because, unlike you, I'm busy actually making stuff." Now, too many people openly mock the practice of reading, writing, or understanding code altogether.

It's sad to see what hacker culture has been reduced to: outright contempt for science and engineering.

akkad33 10 hours ago | parent | next

One way of thinking about it: most people writing software who are not software engineers prefer using AI because they don't consider software valuable in itself; it's only a way to solve a problem. So there are two camps, the other being people who like to solve "software problems" for their own sake. But that latter problem has now been solved by AI.

soraminazuki 8 hours ago | parent

That's exactly the thing I'm trying to call out. AI coding has attracted a flood of people whose only goal is to make a quick buck out of shoddy work. They regard science and engineering as beneath them, and they're not shy about saying it, here and elsewhere.

Any serious professional in this field knows that software development is far from a solved problem. It wasn't before LLMs, and it isn't now. Responsible development takes discipline and respect for the hard-won lessons of past and present efforts.

But no, according to many here, being responsible makes you a "luddite." "Humans make mistakes too," that's what they'll say as they inevitably screw over people's lives with their reckless disregard for others. "It's not my issue to solve."

Seriously, haven't techbros already caused enough damage throughout society with "move fast and break things"? A lot of people are losing patience with this nonsense.

risyachka 10 hours ago | parent | prev

This is because AI is most appealing to average and below-average developers and users: it makes them feel like they can finally do something.

tempest_ a day ago | parent | prev | next

This is more or less my take on it.

I am not against AI code, it can be perfectly fine.

The principal issue, in my mind, is the rate of change.

Once you rewrite a code base like this (in a week, no less), the only way to work on it in the future is with AI tools, because no single person has knowledge of any specific piece of the code base anymore.

AI-generated code that is run through a classic PR process would potentially be fine, but then you sorta lose the entire point of using AI.

jwpapi a day ago | parent | prev

That happened to my project as well. The main issue wasn't that AI couldn't solve the problem, but that it became so slow, and you needed ever more verification layers and CI/CD, that at some point you wished for the simpler codebase back, with reasonable tests, with clear storylines in the code, and so on.