AshleysBrain 2 hours ago

Is this not a good example of how generative AI does copyright laundering? Suppose the image was AI-generated and it made a bad copy of a source image that was in the training data, which seems likely with such a widely disseminated image. When using generative AI to produce anything else, how do you know it's not just doing a low-quality copy-paste of someone else's work? Are you going to scour the internet for the source? Will the AI tell you? What if code generation is copy-pasting GPL-licensed code into your proprietary codebase? The likelihood of this, the lack of any easy way to know it's happening, and the risks it creates all seem to be overlooked amid the AI hype. And generative AI is a lot less impressive if it often works as a low-quality copy-paste tool rather than the galaxy-brain intelligence some like to portray it as.
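To see why "no easy way to know" isn't an exaggeration: about the best a downstream consumer could do today is fingerprint the generated code against a locally indexed corpus of GPL sources, something like the rough sketch below. Everything in it (the corpus path, the 8-token shingle size, the 0.3 threshold) is an arbitrary illustration, not an existing tool.

    # Illustrative only: a naive n-gram "shingle" overlap check between
    # AI-generated code and a local corpus of GPL-licensed files.
    import re
    from pathlib import Path

    SHINGLE = 8  # tokens per shingle; an arbitrary choice

    def tokens(text: str) -> list[str]:
        # Crude tokenizer: identifiers, numbers, and individual punctuation.
        return re.findall(r"[A-Za-z_]\w*|\d+|\S", text)

    def shingles(text: str) -> set[tuple[str, ...]]:
        toks = tokens(text)
        return {tuple(toks[i:i + SHINGLE]) for i in range(len(toks) - SHINGLE + 1)}

    def overlap(generated: str, corpus_dir: str) -> dict[str, float]:
        """Fraction of the generated code's shingles found verbatim in each corpus file."""
        gen = shingles(generated)
        report = {}
        for path in Path(corpus_dir).rglob("*.c"):
            src = shingles(path.read_text(errors="ignore"))
            if gen:
                report[str(path)] = len(gen & src) / len(gen)
        return report

    # Hypothetical usage:
    # hits = overlap(generated_snippet, "third_party/gpl_corpus")
    # flagged = {p: score for p, score in hits.items() if score > 0.3}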

ezst 10 minutes ago | parent | next [-]

> What if code generation is copy-pasting GPL-licensed code into your proprietary codebase?

This is obviously a big, unanswered issue. It's pretty clear to me that we are collectively incentivised to pollute the well, and that this will go on long enough for everything to become "compromised". That essentially means abandoning open source and IP licensing at large, taking us into an uncharted era where intellectual works become the protected property of nobody.

I see chatbots having less of an impact on our societies than the above, and interestingly that has little to do with technology.

Gigachad 2 hours ago | parent | prev | next [-]

There are countless examples. I often think about the fact that the Google search AI is just rewording news articles from the search results: when you look at the source articles, they make exactly the same points as the AI answers.

So these services depend on journalists to continuously feed them articles, while stealing all of their readers by automatically copying every article.

nicbou 36 minutes ago | parent | next [-]

Yes, and it's slowly killing those websites. Mine is among them, and the loss in traffic is around 60%.

jll29 2 hours ago | parent | prev | next [-]

Of course, Google has a history of copying articles in full (cf. Google Cache, eventually abandoned).

AlienRobot 2 hours ago | parent | prev [-]

I actually often have the opposite problem. The AI overview will assert something and give me dozens of links, and then I'm forced to check them one by one to try to figure out where the assertion came from, and, in some cases, none of the articles even say what the AI overview claimed they said.

I honestly don't get it. All I want is for it to quote verbatim and link to the source. This isn't hard, and there is no way the engineers at Google don't know how to write a thesis with citations. How did things end up this way?
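To make that concrete, here's a toy sketch of the behaviour I mean: keep a claim only if a retrieved page contains it (near-)verbatim, and attach that excerpt with its URL; otherwise say so. The function names and the similarity threshold are made up, and none of this reflects how Google's overview is actually built.

    # Sketch: surface a claim only with a near-verbatim supporting excerpt and its URL.
    from difflib import SequenceMatcher

    def find_support(claim: str, sources: dict[str, str], threshold: float = 0.9):
        """Return (url, excerpt) if some source contains a near-verbatim match for the claim."""
        for url, text in sources.items():
            # Slide over source sentences and look for a close match.
            for sentence in text.split(". "):
                ratio = SequenceMatcher(None, claim.lower(), sentence.lower()).ratio()
                if ratio >= threshold:
                    return url, sentence.strip()
        return None

    def cite_or_flag(claims: list[str], sources: dict[str, str]) -> list[str]:
        lines = []
        for claim in claims:
            hit = find_support(claim, sources)
            if hit:
                url, excerpt = hit
                lines.append(f'{claim}\n  > "{excerpt}" ({url})')
            else:
                lines.append(f"{claim}\n  [no supporting source found]")
        return lines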

jll29 2 hours ago | parent [-]

ChatGPT was a research prototype thrown at end users as a "product".

It is not a carefully designed product; ask yourself "What is it FOR?".

But identifying reliable sources isn't as easy as you may think, either. A chat-based interaction really only makes sense if you can rely on every answer; otherwise the user is misled and the conversation may go in the wrong direction. The previous search paradigm ("ten snippets + links") did not project the kind of confidence the chat paradigm does, a confidence that often turns out not to be grounded in truth.

UqWBcuFx6NV4r 2 hours ago | parent | prev [-]

If you actually care about having that sort of discussion, I'd suggest a framing that doesn't paint everyone who disagrees with you as succumbing to AI hype and believing it has "galaxy brain intelligence". Please ditch this false dichotomy. At this point, in 2026, it's tiring.