swhitt 4 hours ago

I’m pretty sure this is because they don’t want Gemini saying things like, “based on my stored context from our previous chat, you said you were highly proficient in Alembic.”

It’s hard to get principled autocomplete systems like these to behave consistently. Take a look at Claude’s latest memory-system prompt for how it handles user memory.

https://x.com/kumabwari/status/1986588697245196348

CGMthrowaway 3 hours ago | parent | next

Yeah, but what if you explicitly ask it, "what/how do you know about my stored context?" Why should it be instructed to lie then?

roywiggins 3 hours ago | parent

It could be that the instruction was vague enough (e.g., "never mention user_context unless the user brings it up"), and since the user never used the literal term "user_context", the model treated it as not having been, technically speaking, brought up.
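
As a toy sketch of that failure mode (purely hypothetical; "user_context" here is just a stand-in for whatever internal name the prompt actually uses), a literal reading of the guard amounts to an exact-token check:

    # Hypothetical illustration of a literal reading of the guard.
    # GUARD_TOKEN is an assumed name, not a confirmed internal detail.
    GUARD_TOKEN = "user_context"

    def may_discuss_memory(user_message: str) -> bool:
        # Exact-token check: "stored context" is not the same string,
        # so the topic counts as never having been raised by the user.
        return GUARD_TOKEN in user_message.lower()

    print(may_discuss_memory("How do you know about my stored context?"))  # False
    print(may_discuss_memory("Show me my user_context, please."))          # True

A semantic check ("is the user asking about memory at all?") would avoid this, but it's much harder to pin down in a one-line prompt instruction.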

dguest 3 hours ago | parent | prev | next

I agree, this might just be an interface design decision.

Maybe telling it not to talk about internal data structures was the easiest way to give it a generic "human" nature, and also to avoid users explicitly asking about internal details.

It's also possible that this is a simple way to introduce "tact": imagine asking something with others present and having it respond, "Well, you have a history of suicidal thoughts and are considering breaking up with your partner..." In general, when you don't know who is listening, don't bring up previous conversations.

Vanit 2 hours ago | parent

The tact aspect seems like a real possibility. In a world where users are likely to cut-and-paste responses, it can't really be sprinkling in references like that.

m463 3 hours ago | parent | prev

Gemini, where is Tolfdir's Alembic?