simonw | 3 days ago
> Normalizing LLMs as “lossy encyclopedias” is a dangerous trend in my opinion, because it effectively handwaves the need for critical thinking skills associated with research and complex task execution

Calling them "lossy encyclopedias" isn't intended as a compliment! The whole point of the analogy is to emphasize that using them in place of an encyclopedia is a bad way to apply them.
stego-tech | 3 days ago | parent
That might’ve been the author’s intent, but the comments in this thread (and the downvotes of my opinions) suggest that a non-zero number of people take that analogy as the best justification yet for LLMs-as-answer-engines, one that shouldn’t be assailable by dissenters or critics. So long as people are dumb enough to gleefully cede their expertise and sovereignty to a chatbot, I’ll keep desperately screaming into the void that they’re idiots for doing so.