bredren 2 days ago

Building Contextify - a macOS application that consumes Claude Code and Codex transcripts and stores them in a local SQL database.
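
Under the hood it's roughly this shape (a simplified sketch; the table layout and file path here are illustrative, not the exact schema):

    import Foundation
    import SQLite3

    // Simplified, illustrative schema: not the app's exact tables or path.
    let ddl = """
    CREATE TABLE IF NOT EXISTS transcripts (
        id         TEXT PRIMARY KEY,   -- session / conversation id
        agent      TEXT NOT NULL,      -- "claude-code" or "codex"
        project    TEXT,               -- repo the session ran in
        started_at REAL,               -- unix timestamp
        raw_json   TEXT NOT NULL       -- original transcript payload
    );
    """

    let path = NSHomeDirectory() + "/Library/Application Support/Contextify/contextify.sqlite"
    var db: OpaquePointer?
    if sqlite3_open(path, &db) == SQLITE_OK {
        sqlite3_exec(db, ddl, nil, nil, nil)   // create the table on first launch
    }
    sqlite3_close(db)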

The main window uses Apple's local LLM to summarize your conversation in real time, with some swoopty UI like the QUEUED state on Claude Code.
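
The summarization itself is roughly a small async call into the on-device model, sketched here with the FoundationModels framework; the prompt wording and truncation are illustrative, and streaming/error handling are left out:

    import FoundationModels

    // Minimal sketch: summarize the tail of a transcript with the on-device model.
    // The prompt text and the 4000-character truncation are illustrative choices.
    func summarize(_ transcriptTail: String) async throws -> String {
        guard case .available = SystemLanguageModel.default.availability else {
            return "(on-device model unavailable)"
        }
        let session = LanguageModelSession(
            instructions: "Summarize this coding-agent conversation in one or two sentences."
        )
        let response = try await session.respond(to: String(transcriptTail.suffix(4000)))
        return response.content
    }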

I've just added macOS Sequoia support and a really cool CLI with a Claude Code skill, allowing seamless integration of information from your conversation history into the AI's responses to questions about your development history.

The CLI interface contract was designed by mutual agreement between Claude Code and Codex, with the goal of satisfying their preferences for RAG.
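
To give a feel for it, the contract is basically a structured query in and structured snippets out. The field names below are placeholders to show the shape, not the shipped contract:

    import Foundation

    // Placeholder field names: shown only to illustrate the shape of the contract.
    struct QueryRequest: Codable {
        let query: String      // natural-language question about past sessions
        let project: String?   // optionally scope to a single repo
        let limit: Int         // max number of snippets to return
    }

    struct QueryHit: Codable {
        let sessionId: String  // which conversation the snippet came from
        let agent: String      // "claude-code" or "codex"
        let timestamp: Date
        let snippet: String    // excerpt the AI can cite in its answer
    }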

This new query feature and pre-Tahoe support should be out this week, but you can download the app now on the App Store or as a DMG.

I'm very excited about this app and would love any feedback from people here on HN!

https://contextify.sh

My Show HN from this past week has a short demo video and a bit more info:

https://news.ycombinator.com/item?id=46209081

nzoschke 2 days ago

Looks awesome for solo / indie devs.

For my small software shop I'd like a team version of this:

- collect all prompts/chats from all devs for our repos
- store them somewhere in the cloud
- summarize them into a feed / digest

bredren a day ago

That's an interesting direction. I hadn't thought of this in a multiplayer sense.

Would you see this as something that is sort of turn-key, where a central database is hosted and secured to your group?

Or would you require something more DIY like a local network storage device?

And similarly would you be open to having the summaries generated by a frontier model? Or would you again need it to be something that you hosted locally?

Thank you for the feedback and interest.

nzoschke a day ago

A central service. Hosted, secure, frontier model is fine. Thinking this through, it's probably something GitHub or an add-on should provide.

But maybe it starts local with an app like yours anyway. I do a lot of solo hacking I don't want to share with the team, too. Then there would be some sort of way to push up subsets of the data.

bredren a day ago

I can see GitHub providing this, but it would still be at the git-operation level.

What I've found using the contextify-query CLI to talk to my projects' CLI AI history is substantial detail and context that captures the journey of a feature (or lack thereof).

In high-velocity agentic coding, git practices seem to be almost cast aside by many. I say that because Claude Code's esc-esc file-reversion behavior doesn't presume "responsible" use of git at all!

What I find interesting is that neither Anthropic nor OpenAI has seized on this; it's somewhat meta to the mainline job of interpreting requests correctly. That said, insight into what you've done and why can save a ton of unnecessary implementation cycles (and wasted tokens to boot).

Any thoughts on the above?

If you're open to giving the app a try and enabling updates on the DMG, the query service + CC skill should drop in a few days. It's pretty dope.

Alternatively, for update notifications you can watch the public repo where I'm publishing DMG releases: https://github.com/PeterPym/contextify/releases

Anyhow, this is really cool feedback and I appreciate the exchange here. Thank you. If you have further thoughts you'd like to share, I'll keep an eye on this thread, or you can reach me at rob@contextify.sh