SkyPuncher 13 hours ago

Yes. I've recently become a convert.

For me, it's less about being able to look back ~800k tokens. It's about being able to let a conversation flow a lot longer without forcing compaction. Generally, I really only need the most recent ~50k tokens, but having the old context sitting around is helpful.

hombre_fatal 13 hours ago | parent [-]

Also, when you hit compaction at 200k tokens, that was probably when things were just getting good. The plan was in its final stage. The context held the hard-won nuances discovered at the last moment. Or the agent just discovered some tiny important details after a crazy 100k token deep dive or flailing death cycle.

Now you have to compact and you don’t know what will survive. And the built-in UI doesn’t give you good tools like deleting old messages to free up space.

I’ll appreciate the 1M token breathing room.

roygbiv2 13 hours ago | parent [-]

I've found compaction kills the whole thing. Important debug steps go completely missing, and the AI loops back round thinking it's found a solution when we've already done that step.

s900mhz 11 hours ago | parent | next [-]

I find it useful to make Claude track the debugging session with a markdown file. It’s like a persistent memory for a long session over many context windows.
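A minimal sketch of that idea, outside of any Claude Code feature: a small helper the agent can be told to run after each debugging step so findings survive compaction. The file name DEBUG_LOG.md and the entry format are my own assumptions, not anything built in.

```python
# Sketch: append each debugging step to a persistent markdown log so the
# record outlives any single context window. Names/format are hypothetical.
from datetime import datetime
from pathlib import Path

LOG = Path("DEBUG_LOG.md")

def log_step(hypothesis: str, result: str, next_step: str) -> None:
    """Append one debugging step to the persistent markdown log."""
    entry = (
        f"\n## {datetime.now():%Y-%m-%d %H:%M}\n"
        f"- **Hypothesis:** {hypothesis}\n"
        f"- **Result:** {result}\n"
        f"- **Next:** {next_step}\n"
    )
    LOG.touch(exist_ok=True)
    with LOG.open("a") as f:
        f.write(entry)

if __name__ == "__main__":
    log_step(
        "Race condition in the cache invalidation path",
        "Reproduced with 2 concurrent writers; lock never released on error",
        "Wrap lock acquisition in try/finally and re-run the repro",
    )
```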

Or make a subagent do the debugging and let the main agent orchestrate it over many subagent sessions.
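And a rough sketch of the orchestration pattern, written directly against the Anthropic Python SDK rather than Claude Code's built-in subagents: each subagent call starts with a fresh context containing only the task plus compact notes, and its summary is folded back into those notes. The model id, prompts, and notes format are placeholders I made up for illustration.

```python
# Sketch: main loop keeps only compact notes; each subagent round burns a
# fresh context window and returns a short summary. Assumptions throughout.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
MODEL = "claude-sonnet-4-20250514"  # placeholder model id

def run_subagent(task: str, notes: str) -> str:
    """One debugging subagent session: fresh context, only the task + notes."""
    response = client.messages.create(
        model=MODEL,
        max_tokens=2000,
        messages=[{
            "role": "user",
            "content": (
                f"You are a debugging subagent.\n\nTask: {task}\n\n"
                f"Notes so far:\n{notes}\n\n"
                "Investigate and reply with a short summary of findings "
                "and the single most promising next step."
            ),
        }],
    )
    return response.content[0].text

def orchestrate(task: str, max_rounds: int = 5) -> str:
    """'Main agent' keeps compact state; subagents do the context-heavy work."""
    notes = "(none yet)"
    for round_num in range(1, max_rounds + 1):
        summary = run_subagent(task, notes)
        notes += f"\n\n### Round {round_num}\n{summary}"
        if "RESOLVED" in summary.upper():  # hypothetical stop marker
            break
    return notes
```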

roygbiv2 10 hours ago | parent [-]

Yeah, I use a markdown file to track progress. It gets kinda long and convoluted, so manual intervention is required every so often. Works though.

garciasn 12 hours ago | parent | prev | next [-]

For me, Claude was like that until about two months ago. Now it rarely gets dumb after compaction like it did before.

8note 12 hours ago | parent [-]

Oh, I've found that something about compaction has been dropping everything that might be useful. Exact opposite experience.
