Gigachad 9 hours ago

It's also seemingly the only way ChatGPT knows how to write, while being fairly uncommon in blog posts beforehand. Of course it's not conclusive proof, but it's the most likely explanation.

WalterGR 9 hours ago | parent [-]

It has a name. The Rule of Threes. https://en.wikipedia.org/wiki/Rule_of_three_(writing)

“The rule of three is a writing principle which suggests that a trio of entities such as events or characters is more satisfying, effective, or humorous than other numbers, hence also more memorable, because it combines both brevity and rhythm with the smallest amount of information needed to create a pattern.”

It’s how I was taught to write, but I understand that my personal experience can’t be generalized to make sweeping statements.

Do you have data that suggests it’s uncommon in human-authored blog posts and more common in LLM-generated text?

palmotea 8 hours ago | parent [-]

> It has a name. The Rule of Threes. https://en.wikipedia.org/wiki/Rule_of_three_(writing)

I don't think that's exactly it.

Speaking of LLM writing in general: it seems to greatly overuse certain constructions, or use them in uncommon contexts. So the tell probably isn't using the rule of threes per se, but overusing it in specific ways and in specific contexts.

WalterGR 8 hours ago | parent [-]

I don’t necessarily doubt you or the grandparent comment, but if it’s ‘obvious to even the most casual of observers’ (as my father would say) then it should be easy to produce hard data.