caymanjim (5 hours ago):
There's no reason to care that a human spent time on it; humans are bad at writing code. Garbage PRs and slop were a problem in open source and bug bounty programs long before AI came on the scene. We need better AI so that there's no need to solicit external bug fixes, and better AI so that contributions can be evaluated for usefulness and quality. Why does it matter whether a human ever looked at it? Caring about that implies humans are adding value to the process. A human can add value, and the right human can add tremendous value, but I'll take a completely autonomous AI over 99% of human software engineers and 99% of the people contributing PRs and bug fixes. It was hard to keep up with slop before, and it's a lot harder now. AI will help weed through the garbage.
48terry (4 hours ago), replying:
If AI is already mass-producing garbage PRs and other unreliable crap, what makes AI, which we've just established produces unreliable crap, the right tool for review? What keeps the reviewing AI's review from being unreliable crap too? A magical, hypothetical AI that always gets it right and makes all these problems go away is neither a solution nor a plan; it's wishful thinking.
| ||||||||||||||||||||||||||
bcjdjsndon (5 hours ago), replying:
Reasonable logic, but I bet you get downvoted.