mettamage 2 hours ago
> Isn't it strange that we expect them to act like humans even though a model remains static after training? An LLM is more akin to a quirky human with anterograde amnesia: it can't form long-term memories anymore, so it can only follow you within a longish conversation.