| ▲ | raincole 4 hours ago |
| What changed is you, the reader. In 2026 we treat the smallest signs as evidence of LLM writing. Too long? LLM. Too short? LLM. Too grammatically correct? Must be LLM. |
|
| ▲ | sanitycheck 3 hours ago | parent [-] |
| For me it was the "it's not x"/"it's y" construction and a few other structures Claude is very fond of. Perhaps humans are starting to write like LLMs! |
| ▲ | raincole 3 hours ago | parent [-] |
| Perhaps, just perhaps, LLMs are just statistical models that literally can't create novel things, and therefore any structure LLMs write was learnt from human writing? But who knows! |
| ▲ | bakugo 2 hours ago | parent [-] |
| What kind of human writing has "it's not X—it's Y" in every single paragraph? The answer is none. LLMs haven't accurately modeled human writing for years; current models have been smacked on the head with the coding RLHF bat so much that they all write distinctly inhuman text. |