sigseg1v 3 hours ago:
How is the getting-stranded example different from asking on a travel forum how to get somewhere, where an active and well-intentioned user who isn't familiar with your area of travel answers, gives you wrong instructions, and you get lost?
andrewflnr 2 hours ago:
The key missing step is where the traveler exercises critical thinking and checks the advice they get. Some people seem to turn that off for LLMs.
shahbaby 3 hours ago:
Because they aren't probabilistic parrots? If they get it wrong, there's usually an understandable reason behind it.
kibwen 3 hours ago:
Because the vast and overwhelming majority of the time, if you ask a question into the ether that nobody has a good answer to, most people will gloss over it and not bother answering, as attested by decades of relatable memes ( https://xkcd.com/979/ ). In contrast, the chatbot is trained to always attempt to give an answer, and is seemingly disincentivized via its training set to just shrug and say "I don't know, good luck fam".