cratermoon 2 days ago
"Another critical lesson is that humans are distinctly bad at monitoring automated processes". Humans are also distinctly bad at noticing certain kinds of bugs in software: think off-by-one errors, deadlocks, or any bug you've stared at for days without spotting the one missing or extra semicolon. But an LLM can generate a tsunami of subtly wrong code in the time it takes a reviewer to notice one typo and miss all the rest.
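To make the off-by-one point concrete, here's a hypothetical sketch (the function and its name are illustrative, not from the thread) of the kind of bug a reviewer can read past many times:

```python
def moving_sum(values, window):
    """Return the sum of each `window`-length slice of `values`."""
    sums = []
    # Subtle off-by-one: this loop stops one window short.
    # The correct bound is len(values) - window + 1.
    for i in range(len(values) - window):
        sums.append(sum(values[i:i + window]))
    return sums

# moving_sum([1, 2, 3, 4], 2) returns [3, 5];
# the correct result would be [3, 5, 7].
```

The code runs, produces plausible output, and looks right at a glance, which is exactly why a reviewer skimming a large volume of generated code is likely to miss it.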
aphyr 2 days ago
Yes. For more on this, see section 2: https://aphyr.com/posts/412-the-future-of-everything-is-lies...
| ||||||||
intended 2 days ago
> "Another critical lesson is that humans are distinctly bad at monitoring automated processes". I believe the technical term is vigilance decrement?