walrus01 7 hours ago
For one, everything its 'intelligence' knows about solving a problem is confined to the finite context window of the particular model and session. Unless the contents of the context window are saved to storage and reloaded later, unlike a human it won't "remember" that it solved the problem or save its work somewhere to be easily referenced later.
in-silico 6 hours ago
For one, everything a human's "intelligence" knows about solving a problem is contained within the finite brain size for the particular person and life. Unless the memory contents of the brain are saved to storage and reloaded later, it won't "remember" that it solved the problem or save its work somewhere to be easily referenced in a later life.
jychang 7 hours ago
There are humans who have memory issues, or full-blown anterograde amnesia.
resident423 7 hours ago
What you're describing sounds more like the model lacking awareness than lacking intelligence. Why does it need to know it solved the problem in order to be intelligent?
charcircuit 6 hours ago
As another commenter pointed out, these models are being trained to save and read context to files, so denying them the use of an ability they actually have just makes your claim tautological.
bpodgursky 6 hours ago
All modern harnesses write memory files so context can be reloaded in a later session.
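[editor's note: the memory-file pattern discussed above can be sketched minimally as below. This is a hypothetical illustration, not any particular harness's implementation; the file name `agent_memory.json` and the helper names are assumptions.]

```python
import json
from pathlib import Path

# Hypothetical on-disk memory file; real harnesses choose their own location/format.
MEMORY_FILE = Path("agent_memory.json")

def load_memory() -> dict:
    """Reload notes from a prior session, if any exist, into the new context."""
    if MEMORY_FILE.exists():
        return json.loads(MEMORY_FILE.read_text())
    return {}

def save_memory(notes: dict) -> None:
    """Merge solved-problem notes into the file so a future session can reload them."""
    existing = load_memory()
    existing.update(notes)
    MEMORY_FILE.write_text(json.dumps(existing, indent=2))

# One session records what it solved; a later session starts by reading it back,
# which is what lets the model "remember" across context windows.
save_memory({"issue-42": "fixed by pinning the dependency version"})
print(load_memory()["issue-42"])
```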