vagrantstreet 2 hours ago
Isn't it strange that we expect them to act like humans even though a model remains static once it has been trained? How is that supposed to be even close to "human like" anyway?
mettamage 2 hours ago
> Isn't it strange that we expect them to act like humans even though a model remains static once it has been trained?

An LLM is more akin to interacting with a quirky human with anterograde amnesia: it can't form new long-term memories, so it can only follow you within a single long-ish conversation.
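Roughly, in Python (generate_reply is a made-up stand-in for any frozen model, not a real API): the weights never change between calls, and the only "memory" is the transcript you choose to pass back in each turn:

    def generate_reply(history: list[str]) -> str:
        """Stands in for a frozen model: a pure function of its input.
        Nothing about the model itself changes between calls."""
        return f"(reply conditioned on {len(history)} prior messages)"

    history = []  # the conversation transcript IS the memory
    for user_msg in ["hi", "remember me?", "what did I say first?"]:
        history.append(user_msg)
        reply = generate_reply(history)  # the model learns nothing here
        history.append(reply)

    history.clear()  # drop the transcript and the "amnesia" is total

Reset or truncate the history and the model has no trace the conversation ever happened, which is exactly the amnesia analogy.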
LiamPowell 2 hours ago
If we could reset a human to a prior state after each conversation, would conversations with them not still be "human like"? I'm not arguing that LLMs are human here, just that your reasoning doesn't hold.
| ||||||||