hvs 3 hours ago
Came here to say that a lot of these LLM posts make me feel like I was hit by a hammer and I can't understand the world anymore. Thankfully, the HN comments confirm that this is as insane as I thought it was.
femto 2 hours ago | parent
I think it's because LLM output tends to be snippets of truth, connected in such a way as to be subtly false. It leaves you disoriented because it is in the uncanny valley of truth: you know it's not right, but can't put your finger on why it is not right. It's made worse by its asymmetric nature: it takes seconds to generate pages of words and hours to figure out what is wrong with them, for little to no gain to the reader.