jcalvinowens 5 hours ago

Based on a lot of real world experience, I'm convinced LLM-generated documentation is worse than nothing. It's a complete waste of everybody's time.

The number of people who I see having E-mail conversations where person A uses an LLM to turn two sentences into ten paragraphs, and person B uses an LLM to summarize the ten paragraphs into two sentences, is becoming genuinely alarming to me.

saulpw 3 minutes ago | parent | next [-]

LLM-generated documentation is great for LLMs to read so they can code better and/or more efficiently. You can write it manually, but as I've discovered over the decades, humans rarely read documentation anyway. So you'll be spending a lot of time writing good documentation just for the bots.

ryandrake 5 hours ago | parent | prev | next [-]

> The number of people who I see having E-mail conversations where person A uses an LLM to turn two sentences into ten paragraphs, and person B uses an LLM to summarize the ten paragraphs into two sentences, is becoming genuinely alarming to me.

I remember in the early days of LLMs this was the joke meme. But now, seeing it happen in real life is more than just alarming. It's ridiculous. It's the opposite of compressing a payload over the wire: we're taking our output, expanding it, transmitting it, and then compressing it back down on the receiving end. Why do we do this?

zahlman 3 minutes ago | parent [-]

> But now, seeing it happen in real life is more than just alarming. It's ridiculous. It's the opposite of compressing a payload over the wire: we're taking our output, expanding it, transmitting it, and then compressing it back down on the receiving end. Why do we do this?

I assume this is satire.

claudiulodro 4 hours ago | parent | prev | next [-]

> Based on a lot of real world experience, I'm convinced LLM-generated documentation is worse than nothing. It's a complete waste of everybody's time.

I had a similar realization. My team was discussing whether we should hook our open-source codebases into an AI to generate documentation for other developers, and someone asked, "Why can't they just generate documentation for it themselves with AI?" It's a good point: what value would our AI-generated documentation provide that theirs wouldn't?

saratogacx 8 minutes ago | parent [-]

It isn't valuable if you generate it and toss it over the fence. The value comes in when the team verifies the content. Once that's done and corrections are made, the documentation carries the assurance that it matches the code.

If you aren't willing to put in the time to verify it, then it is indeed no more useful than anyone else doing the same task on their own.

The_Fox an hour ago | parent | prev [-]

Yesterday my manager sent LLM-generated code that did a thing. Of course I didn't read it; I only read Claude's summary of it. Then I died a little inside.

It was especially unfortunate because, to do its thing, the code required a third party's personal user credentials, including MFA. That's a complete non-starter in server-side code, but apparently the manager's LLM wasn't aware enough to know that.
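
To make the contrast concrete: an unattended server process can't answer an interactive MFA prompt, so the usual fix is a machine credential such as an OAuth2 client-credentials grant. Below is a minimal sketch of that pattern; the token URL, client ID, and client secret are hypothetical placeholders for illustration, not the actual third party's API.

    # Sketch: server-side auth with a machine credential instead of a
    # person's login + MFA. token_url, client_id, and client_secret are
    # placeholders for illustration only.
    import requests

    def get_service_token(token_url: str, client_id: str, client_secret: str) -> str:
        # OAuth2 client-credentials grant: no human in the loop, no MFA
        # prompt, and the secret can be rotated without locking anyone out.
        resp = requests.post(
            token_url,
            data={"grant_type": "client_credentials"},
            auth=(client_id, client_secret),
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()["access_token"]

A personal account with MFA, by contrast, needs a human to approve each login, which is exactly what a server process can't do.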