bredren 3 days ago

Hey Mark, I posted about this in another comment [1], but I also think the LLM is decent, and beyond its quality, the scale of distribution is a big deal.

I had been pondering practical implementations of the model since it was announced, and today I released a new native macOS application that uses it to summarize Claude Code and Codex conversations as they occur. [2]
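Roughly, the idea is to watch each agent's transcript as it grows and hand newly appended turns to the model for summarization. A minimal sketch of that pattern (the transcript path, endpoint, and model name below are placeholders, not the app's actual code):

    # Hypothetical sketch: poll an agent transcript file and summarize new
    # content with a locally served model. Paths and names are assumptions.
    import json
    import time
    import urllib.request
    from pathlib import Path

    TRANSCRIPT = Path.home() / ".claude" / "projects" / "example-session.jsonl"  # placeholder path
    ENDPOINT = "http://localhost:11434/api/generate"  # assumes an Ollama-style local server
    MODEL = "gemma3"  # placeholder model name

    def summarize(text: str) -> str:
        """Ask the local model for a short summary of the new transcript content."""
        payload = json.dumps({
            "model": MODEL,
            "prompt": f"Summarize this coding-agent exchange in two sentences:\n\n{text}",
            "stream": False,
        }).encode()
        req = urllib.request.Request(ENDPOINT, data=payload,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    def tail_and_summarize(path: Path, poll_seconds: float = 5.0) -> None:
        """Poll the transcript and summarize whatever was appended since the last pass."""
        offset = 0
        while True:
            if path.exists():
                data = path.read_text(encoding="utf-8")
                if len(data) > offset:
                    new_text = data[offset:]
                    offset = len(data)
                    print(summarize(new_text))
            time.sleep(poll_seconds)

    if __name__ == "__main__":
        tail_and_summarize(TRANSCRIPT)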

If you use either of these CLI agents and have time to try the app out and provide feedback, I'd appreciate it! I'm at rob@contextify.sh.

[1] https://news.ycombinator.com/item?id=46209975

[2] https://contextify.sh

mark_l_watson 3 days ago | parent

Good product idea! Keeping track of both Claude Code and Codex and doing some syncing is a nice feature. I don’t use Claude Code and cancelled my OpenAI subscription a few months ago (I use local models and Gemini), so I am not a potential customer for your product. BTW, I have experimented with storing my own sync info long term for local models; it's a difficult problem.