ethin (7 hours ago):

They are easy to avoid if you actually give a damn. Unfortunately, the people who create these things don't, assuming they even know what half of these attacks are in the first place. They just want to pump out something now now now, and the mindset is "we'll figure out all the problems later, I want my cake now now now now!" Maximum velocity! Full throttle!

It's just as bad as a lot of the vibe-coders I've seen. I literally saw a vibe-coder who created an app without even knowing what they wanted to create (as in, what it would do), and the AI they were using to vibe-code hand-wrote a PE parser to load DLLs instead of using LoadLibrary or delay loading. Which, really, is the natural consequence of giving someone access to software engineering tools when they don't know the first thing about them. Is that gatekeeping of a sort? Maybe, but I'd rather have that than "anyone can write software, and oh by the way this app reimplements wcslen in Rust because the vibe-coder had no idea what they were even doing".
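For contrast, the idiomatic approach is a couple of calls into the Win32 loader rather than a hand-rolled PE parser. A minimal sketch in C (the DLL name and export below are hypothetical placeholders):

    #include <windows.h>
    #include <stdio.h>

    /* Hypothetical signature of an export we want to call. */
    typedef int (WINAPI *AddFn)(int a, int b);

    int main(void)
    {
        /* Let the OS loader map the module: no hand-written PE parsing. */
        HMODULE mod = LoadLibraryW(L"example.dll");    /* hypothetical DLL name */
        if (mod == NULL) {
            fprintf(stderr, "LoadLibraryW failed: %lu\n", GetLastError());
            return 1;
        }

        /* Resolve the export by name through the same loader machinery. */
        AddFn add = (AddFn)GetProcAddress(mod, "Add"); /* hypothetical export */
        if (add != NULL) {
            printf("Add(2, 3) = %d\n", add(2, 3));
        }

        FreeLibrary(mod);
        return 0;
    }

Delay loading gets the same effect declaratively: link against the import library and pass the linker's /DELAYLOAD flag, and the library is loaded lazily on first use.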
lxgr (7 hours ago):

> "we'll figure out all the problems later, I want my cake now now now now!" Maximum velocity! Full throttle!

That is indeed the point. Moltbot reminds me a lot of the demon core experiment(s): laughably reckless in hindsight, but ultimately also an artifact of a time of massive scientific progress.

> Is that gatekeeping of a sort? Maybe, but I'd rather have that

Serious question: what do you gain from people not being able to vibe code?

hugey010 (7 hours ago):

Not who you're responding to, but I'm not a huge fan of vibe coding, for two reasons: I don't want to use crappy software, and I don't want to inherit crappy software.

lxgr (6 hours ago):

Same, but I both used and inherited crappy software long before LLMs and agents were a thing. I suppose it's going to be harder to identify obvious slop at first glance, but fundamentally, what changes?
chrisjj (7 hours ago):

> They just want to pump out something now now now

Some people actually fell for "move fast and break things".

ejcho (7 hours ago):

I think with the advent of the AI gold rush, this is exactly the mentality that has proliferated throughout new AI startups: just ship anything and everything as fast as possible, because all that matters is growth at all costs. Security is hard, and it takes time, diligence, and effort; investors aren't going to be looking at the metric of "days without security incident" when flinging cash into your dumpster fire.
chrisjj (7 hours ago):

> At least currently, I don't think we have good ways of preventing the former, but the latter should be possible to avoid.

Here's the thing: people who don't see a problem with the former obviously have no interest in addressing the latter.