mort96 | 5 hours ago
As someone who has dealt with projects with AI-generated documentation... I can't really say I agree. Good documentation is terse, efficiently communicating the essential details. AI output is soooooooo damn verbose. What should've been a paragraph becomes a giant markdown file. I like reading human-written documentation, but AI-slop documentation is so tedious I just bounce right off.

Plus, when someone wrote the documentation, I can ask the author about details, and they'll probably know the answer, since they had enough domain expertise and knowledge of the code to explain anything that might be missing. I can't trust you to know anything about code you had an AI generate and then had an AI write documentation for.

Then there's the accuracy issue. Any documentation can be inaccurate, and it can obviously get outdated with time, but at least with human-authored documentation I can be confident that the content at some point matched a person's best understanding of the topic. With AI, no understanding is involved; it's just probabilistically generated text. We've all hopefully seen LLMs generate plausible-sounding but completely wrong text often enough to doubt their output.
brookst | 4 hours ago
Classic perfect-is-the-enemy-of-good. The choice is not usually "have humans write amazing, top-notch documentation" or "use an LLM". The choice is usually "have sparse, incomplete, out-of-date documentation... or use an LLM".