superice 4 hours ago

Aside from the legal question of whether the manufacturer allows you to do so, I’m pretty excited about somebody vibe coding firmware for a 35-ton machine with a bunch of big attachments at the back and plenty of ways to mangle the bodies of careless operators without the rpm so much as audibly rising from strain. Should give us plenty of videos to traumatize the next generation of children with a little too much internet access at an early age. I feel nostalgic for those days.

(This is sarcasm, pretty please don’t vibe code car firmware, let alone anything more dangerous than that)

slopinthebag 4 hours ago | parent [-]

As long as you have a sufficient test suite, you could probably run a Ralph Wiggum loop and have it brute-force its way to a solution. Creating the test suite would be the harder part, though.
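A "Ralph Wiggum loop" in this sense is just retry-until-green: generate a candidate, run the suite, repeat on failure. A minimal sketch of the control flow, where `generate_candidate` and `run_tests` are toy stand-ins (a real loop would prompt an LLM with the failing test output, not sample randomly):

```python
import random

def run_tests(candidate):
    # Stand-in for a real test suite: "passing" here just means
    # the candidate doubles its input for one toy case.
    return candidate(2) == 4

def generate_candidate(rng):
    # Stand-in for an LLM call: randomly picks a multiplier.
    k = rng.choice([1, 2, 3])
    return lambda x: x * k

def ralph_wiggum_loop(max_iters=100, seed=0):
    """Retry candidate generation until the test suite passes."""
    rng = random.Random(seed)
    for i in range(1, max_iters + 1):
        candidate = generate_candidate(rng)
        if run_tests(candidate):
            return candidate, i
    raise RuntimeError("no passing candidate within the iteration budget")

candidate, iters = ralph_wiggum_loop()
print(candidate(10))  # → 20, since passing requires the doubling candidate
```

The loop only ever converges on behavior the suite actually checks, which is the whole objection downthread: anything the tests don't encode, the loop is free to get wrong.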

skinwill 2 hours ago | parent | next [-]

The phrase "sufficient test suite" is doing a LOT of work here. You would need to know what the data from every sensor is supposed to look like, along with how every piece of the machine is supposed to perform. AI isn't going to be able to iterate its way into those parameters overnight.

superice 2 hours ago | parent | prev [-]

I've never been fond of the argument that there should be a professional software engineer certification, but seeing people like you presented with the potential dangers and responding 'oh yeah, just go with a better test suite and you can wing it' makes me seriously reconsider.

Vibe code administrative systems for your local golf club to your heart's desire for all I care; god forbid somebody has to stand around a bit longer before going for their 9 holes. But safety-critical equipment is not the place to fuck around with code prediction machines that have existed for 4 years, have been writing more-or-less acceptable code for 2, and will still regularly refer to themselves as MechaHitler or just make shit up. "Yes, you're absolutely correct, I was wrong" doesn't help you one bit if you have just been chewed up by heavy machinery, and the fact that there are people who go 'oh, surely a few more unit tests will fix it' is a terrifying thought.

slopinthebag 38 minutes ago | parent [-]

But don't humans make mistakes too? Are we sure the failure rate of AI with the right checks and bounds is higher than that of humans, who are flawed machines themselves?

If you need assurances, have a different LLM write the test suite.