rurp | 2 days ago
The recent taking of people's content for AI training might be the most blatant example I've ever witnessed of rich, well-connected people playing by different rules in our society. If a random person copied mass amounts of IP and resold it in a different product with zero attribution or compensation, and that product directly undercut the business of those same IP producers, they would be thrown in jail. Normal people get treated as criminals for seeding a few movies, but the Sam Altmans of the world can break those laws on an unprecedented scale with no repercussions.

As sad as it is, I think we're looking at the end of the open internet as we've known it. This is a massive tragedy-of-the-commons situation, and there seems to be roughly zero political will to enact the regulations needed to keep things fair and sustainable. The costs of this trend are massive, but they are spread out across many millions of disparate producers and consumers, while the gains are extremely concentrated in the hands of a few; and those few have good lobbyists.
tim333 | 2 days ago | parent
The trouble is that what LLMs do is effectively read a lot of articles and then produce a summary. What human writers do is quite similar: read a lot of stuff and then write their own article. It's hard to prohibit something people have always done just because it's now done by an LLM rather than a human. Even if you wanted to ban LLMs, when an article goes up, how can you tell whether it was written entirely by a human or the human used an LLM?
Drew_ | 2 days ago | parent
I agree wholeheartedly. It seems clear to me that art and knowledge will transition to more private and/or undocumented experiences in the coming years in order to preserve their value.