furyofantares 3 hours ago
There isn't a strict definition of AGI, there's no way to find evidence for what equates to it, and besides, criteria like this are meant only as likely necessary conditions. Anyway, from the article:

> As long as there is a gap between AI and human learning, we do not have AGI.

This seems like a reasonable requirement. Something I think about a lot with vibe coding is that, unlike humans, individual models do not get better within a codebase over time; they get worse.
fragmede 3 hours ago
Is that within a codebase of relatively fixed size, where things get worse as time goes on? Or are you saying that as the codebase grows, the limits of the model's context window mean it can no longer hold the entire codebase in context, so it performs worse than it did when the codebase was smaller?