Incipient 4 hours ago

I still haven't found useful "memory". It's either an agents.md with a high-level summary, which is fairly useless for specific details (e.g. "editing this element needs to mark this other element as a draft"), or something detailed explaining the nitty-gritty, which seems to give so much detail that it gets ignored, or detail from one functional area contaminates the intended changes in another functional area.

The only approach I've found that works is no memory, and manually choosing the context that matters for a given agent session/prompt.

jvwww 3 hours ago | parent | next [-]

Yeah, I feel the same way. I wonder when/if we'll get continual learning from these models. I feel like they are smart enough already, but their lack of real memory makes them a pain to deal with.

hirako2000 2 hours ago | parent [-]

Google Gemini does this sort of thing. External to the model, I presume. And it's very annoying.

A friend told me he would like Claude to remember his personality, which is exactly what Gemini is trying to do.

A machine pretending to be human is disturbing enough. A machine pretending to understand you will spiral very far into spitting out exactly what we want to read.

clutter55561 4 hours ago | parent | prev [-]

All the memories Claude created for me fell into the remember-not-to-forget category, so I disabled it altogether.