thunky a day ago

> its not looking back at what was just generated

It is, though. The LLM gets the full history in every prompt until you start a new session; that's why it gets slower as the conversation (and its context) grows.
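
Roughly what that loop looks like with an OpenAI-style chat client (a minimal sketch; the model name and helper are just placeholders, not any particular product's internals):

    # Each turn appends to `history`, and the *entire* list is sent again,
    # so the prompt the model sees grows with the conversation.
    from openai import OpenAI

    client = OpenAI()
    history = [{"role": "system", "content": "You are a helpful assistant."}]

    def ask(user_text: str) -> str:
        history.append({"role": "user", "content": user_text})
        resp = client.chat.completions.create(model="gpt-4o", messages=history)
        answer = resp.choices[0].message.content
        history.append({"role": "assistant", "content": answer})
        return answer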

The developer could choose to rewrite or edit the history before sending it back to the LLM but the user typically can't.
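
For example, the app could keep the system prompt but drop or summarize older turns before each request (again just a sketch, assuming the same illustrative setup as above):

    # Developer-side trimming the end user never sees:
    # keep the system prompt plus only the last N messages.
    def trimmed(history, keep_last=6):
        system = [m for m in history if m["role"] == "system"]
        rest = [m for m in history if m["role"] != "system"]
        return system + rest[-keep_last:]

    # resp = client.chat.completions.create(model="gpt-4o", messages=trimmed(history))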

> There's no guarantee everything stays the same except the mistake

Sure, but there's no guarantee about anything it generates. That's a separate issue, though.