mmsc 6 hours ago

This AI usage is like a turbo-charger for the Dunning–Kruger effect, and we will see these policies crop up more and more as technical people grow increasingly harassed and burnt out by AI slop.

I also recently wrote a similar policy[0] for my fork of a codebase. I had to write it because the original developer took the AI pill, started committing totally broken code that was full of bugs, and doubled down when asked about it [1].

At the analysis level, I recently commented[2] that "Non-coders using AI to program are effectively non-technical people, equipped with the over-confidence of technical people. Proper training would turn those people into coders that are technical people. Traditional training techniques and material cannot work, as they are targeted and created with technical people in mind."

But what's more, we're also seeing programmers using AI to create slop. They're effectively technical people equipped with their initial over-confidence, now highly inflated by a sense of effortless capability. Before AI, developers were (sometimes) forced to pause, investigate, and understand; now it's easier and more natural to simply assume they grasp far more than they actually do, because @grok told them it's true.

[0]: https://gixy.io/contributing/#ai-llm-tooling-usage-policy

[1]: https://joshua.hu/gixy-ng-new-version-gixy-updated-checks#qu...

[2]: https://joshua.hu/ai-slop-story-nginx-leaking-dns-chatgpt#fi...