doug_durham 21 hours ago

This isn't an AI issue. It's a care issue. People shouldn't submit PRs to a project when they don't care enough to understand the project or the code they're submitting. This has always been a problem; there's nothing new here. What is new is that more people can get to the point of submitting regardless of their care or understanding. A lot of people are trying to gild their resume by saying they contributed to a project. Blaming AI is blaming the wrong problem. AI is a tool, like a spreadsheet. Project owners should instead be working on ways to filter out careless code more efficiently.

johnnyanmac 7 hours ago | parent | next [-]

That's why I'm not super optimistic. Even before AI and the tech slump, there was talk about how hard it might be to replace the old guard maintaining these open source initiatives. Now...

> Blaming AI is blaming the wrong problem. AI is a tool, like a spreadsheet. Project owners should instead be working on ways to filter out careless code more efficiently.

When care leaves, the entire commons starts to fall apart. New talent doesn't come in. Old talent won't put up with it and retires from the scene. Maintainers already have so much work to do; adding non-development work to build better spam filters may very well be the final straw.

Even when the careless leave, it won't bring back the talent that was lost. Redirecting the blame sure won't do that either.

exasperaited 20 hours ago | parent | prev [-]

This is an AI issue because people, including the developers of AI tools, don't care enough.

The Tragedy Of The Commons is always about this: people want what they want, and they do not care to prevent the tragedy, if they even recognise it.

> Project owners should instead be working ways to filter out careless code more efficiently.

Great. So the industry creates a burden and then forces people to deal with it — I guess it's an opportunity to sell some AI detection tools.

danielbln 8 hours ago | parent [-]

We don't need an AI detector; we need a "human vetted" detector.

johnnyanmac 7 hours ago | parent | next [-]

Who's paying the human to vet it? Or will we have volunteers dedicated to being AI detectors instead of developers?

danielbln 7 hours ago | parent [-]

I don't have those answers. My point was that trying to outright ban AI is futile and probably counterproductive overall, and that we need to find ways to ensure a human isn't just submitting slop. I don't have an answer as to the how.

exasperaited 2 hours ago | parent | prev [-]

People arguing against my point here seem to be doing a good job of validating my point.