gf000 | 7 days ago
Well, according to the recently linked Naur paper, the mental model for a codebase includes the code that wasn't written just as much as the code that was - e.g. the decision to choose this design over another one, etc. This is not recoverable by AI without every meeting note and interaction between the devs, clients, etc.
lordnacho | 7 days ago | parent
Not for an old project, but if you've talked an AI through building something, you've also told it "nah, let's not change the interface" and similar decisions, which will sit in the context.