lovich | 3 days ago
It’s very frustrating for the bot’s reinforcement function. This is another account created after widespread access to LLMs became available to the public that is pushing a political view that is somewhat coherent until pressed, and then it falls apart like all chat bots.

Maybe it’s a real person and I’m being an asshole here, but it’s hard to tell. The fact that it’s hard to tell whether they are real or not means we need to come up with a heuristic to identify actual humans, now that passing the Turing test has become trivially cheap.
dang | 2 days ago
> Maybe it’s a real person and I’m being an asshole here, but it’s hard to tell.

The site guidelines are clear on this: you should assume that it's a real person and try your best to reel back these sorts of accusations, which are nearly always wrong, and nearly always driven by differences of background and (therefore) opinion. https://news.ycombinator.com/newsguidelines.html

I'm rushing out the door just now, but here are a couple of past explanations about this:

https://news.ycombinator.com/item?id=35932851 (May 2023)

https://news.ycombinator.com/item?id=41948722 (Oct 2024)

(as well as https://hn.algolia.com/?sort=byDate&dateRange=all&type=comme... of course)
alex1138 | 3 days ago
[flagged]