BinaryIgor a day ago

I agree with the first part of your comment, but don't follow the rest - why should SE be more tightly regulated? It doesn't need to be; if anything, that would just stifle its progress and evolution.

sublinear a day ago

I think AI will make more visible where code diverges from the average. Maybe auditing will be the killer app for near-future AI.

I'm also thinking about a world where more programmers try to enter the workforce self-taught with AI. The current trend is a continued lowering of education standards and a political climate hostile to universities.

The answer to all of the above, from the perspective of people who don't know or particularly care about the details, may be to cut the knot and impose regulation.

Delegate the details to auditors with AI. We're kinda already doing this on the cybersecurity front. Think about all the ads you see nowadays for earning your "cybersecurity certification" from an online-only university. Those jobs are real and people are hiring, but the expertise is still lacking because there aren't clearer guidelines yet.

With the current technology and the generations of people we have, how else but with AI can you translate NIST requirements, vulnerability reports, and other docs that don't even exist yet but soon will into pointing someone who doesn't really know how to code at a line of code they can investigate? The tools we have right now, like SAST and DAST, are full of false positives, and non-devs are stumped on how to assess them.
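To make the false-positive problem concrete, here's a minimal sketch (the function is my own illustration, not something from this thread). Bandit, a common Python SAST tool, flags any use of the `random` module (rule B311) as "insecure randomness" - but when the randomness is only retry jitter, not a secret, the finding is noise that a non-dev auditor has no way to assess on their own:

```python
import random

def backoff_delays(attempts: int = 3) -> list[float]:
    """Compute exponential backoff delays with jitter.

    Bandit's B311 rule flags the `random` module as insecure,
    but here the randomness is only scheduling jitter, not a
    security decision -- a classic SAST false positive.
    """
    # Each delay is 2^i seconds plus up to 1 second of jitter.
    return [(2 ** i) + random.random() for i in range(attempts)]
```

Triaging a report like this requires understanding what the value is *used for*, which is exactly the kind of context-dependent judgment the comment argues current tools can't provide on their own.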

Even before this latest round of AI stuff, it's been a concern that we overwork and overtrust devs. The principle of least privilege isn't really enough, and it's often violated in any scenario that isn't the usual day-to-day work.