SpicyLemonZest 14 hours ago
That seems like an unreasonably high standard. I like to think that I have memory, agency, and self-awareness, but I'm not ready to run my organization without further input from anyone.

> Do you realize that "memory" requires eating your hilariously small context window?

I do! LLMs are structured differently than humans: the component we call "memory" corresponds to what humans call short-term memory, while practical long-term memory for an LLM looks much more like what a human would call "let me write this down". But you can, and commercially available systems do, load those notes into context on demand when they're needed for some problem or another.
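To make the "write it down, load it on demand" idea concrete, here is a minimal sketch of such a memory layer. Everything in it (MemoryStore, call_llm, the toy keyword-overlap recall) is a hypothetical illustration, not any vendor's actual API; real systems usually rank notes by embedding similarity rather than word overlap.

    # Sketch: long-term memory as external notes, loaded into context on demand.
    # All names here are hypothetical illustrations, not a real product's API.

    def call_llm(prompt: str) -> str:
        # Stand-in for an actual model call.
        return f"[model response to {len(prompt)} chars of prompt]"

    class MemoryStore:
        """Persists notes outside the model's context window."""

        def __init__(self) -> None:
            self.notes: list[str] = []

        def write(self, note: str) -> None:
            self.notes.append(note)  # "let me write this down"

        def recall(self, query: str, k: int = 3) -> list[str]:
            # Toy relevance: rank notes by word overlap with the query.
            # Real systems typically use embedding similarity instead.
            q = set(query.lower().split())
            ranked = sorted(self.notes,
                            key=lambda n: len(q & set(n.lower().split())),
                            reverse=True)
            return ranked[:k]

    def answer(store: MemoryStore, query: str) -> str:
        # Only the few relevant notes get spliced into the prompt, so
        # "memory" doesn't eat the whole context window.
        relevant = store.recall(query)
        prompt = "Relevant notes:\n" + "\n".join(relevant) + f"\n\nUser: {query}"
        return call_llm(prompt)

    store = MemoryStore()
    store.write("The Q3 budget review moved to October.")
    store.write("SpicyLemonZest wants status updates on Mondays.")
    print(answer(store, "When is the budget review?"))

The point is just that persistence lives outside the context window and is pulled in per query, much like a human consulting their own notes.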
operatingthetan 14 hours ago | parent
> memory, agency, and self-awareness

The LLM currently has only the illusion of these things. Hence the bubble. I know that you (or anyone else) as a human being don't merely have the illusion of them.

This is not like the car replacing the horse for transportation. The LLM as-is cannot fundamentally replace the person. It requires the agency of a human to take a turn at all, and even more so to enact change in the world. Your LLM does not actively engage with the world, because it does not experience anything; it only responds to queries. We can do a lot with that, but it's not intelligence. It can't say, "Oh hey SpicyLemonZest, I was thinking and had an idea the other day," because it has nothing between each query.