johnisgood | 6 days ago
I feel the same way. Thankfully there are still obvious signs when LLMs are used, but it is not always so obvious. I think we may be better off assuming X is fake and going from there. Sad, but what else can we do? There are websites that tell you (with a percentage) whether or not something was written by an LLM. Unfortunately, however, some of my own writing comes back as a false positive. We need improvements on this front, and I believe we will get them.
prmoustache | 6 days ago | parent
Reality can be faked even without LLMs. Take Instagram, YouTube Shorts, and TikTok, for instance. I see people watching tons of short videos that are supposedly funny or shocking, and they seem to believe the videos are completely real rather than staged and produced content, until I challenge them on a number of trivial details that make it implausible those videos were recorded by chance or in an opportunistic manner.