taurath a day ago
This really gets at an acceleration of enshittification. If you can't tell it's an LLM, and there's nobody to verify the information, humanity is architecting errors and mindfucks into everything. All of the markers of what is trustworthy have been co-opted by untrustworthy machines, so all of the ways we'd previously differentiated actors have stopped working. It feels like we're losing truth as rapidly as LLMs can generate mistakes. We've built a scoundrel's paradise. How useful is a library of knowledge when n% of the information is suspect? We're all about to find out.
Henchman21 21 hours ago
You know, things looked off to me, but it wasn't obvious that this was the output of an LLM -- even though that was the claim! I feel ill-equipped to deal with this, and as the enshittification has progressed I've found myself using "the web" less and less. At this point, I'm not sure there's much left on the web that I value. I wish the enshittification weren't seemingly pervasive in life.