neomantra 7 hours ago

Thanks for sharing this — I appreciate your motivation in the README.

One suggestion, and something I have been trying to do myself, is to include a PROMPTS.md file. Since your purpose is sharing and educating, it helps others see what approaches an experienced developer is taking, even if you are just figuring it out.

One can use a Claude hook to maintain this file deterministically; I instruct in AGENTS.md that the agents may read but not write it. It's also been helpful when jumping between LLMs, to give a new model some background on what you've been doing.
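For the curious, here's roughly what such a hook could look like: a small script registered for Claude Code's UserPromptSubmit hook event that appends each prompt to PROMPTS.md. This is a minimal sketch; the payload field names and the assumption that the command runs from the project root are taken from the hook docs as I've read them, so verify against your Claude Code version.

```python
#!/usr/bin/env python3
"""Sketch of a Claude Code UserPromptSubmit hook that logs prompts to PROMPTS.md.

Register it in .claude/settings.json under hooks -> UserPromptSubmit as a
"command" hook, e.g. {"type": "command", "command": "python3 .claude/hooks/log_prompt.py"}.
The exact wiring and payload fields are assumptions; check your version.
"""
import json
import sys
from datetime import datetime, timezone
from pathlib import Path

payload = json.load(sys.stdin)              # hook input arrives as JSON on stdin
prompt = payload.get("prompt", "").strip()  # UserPromptSubmit payloads carry the prompt text
if prompt:
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    with Path("PROMPTS.md").open("a", encoding="utf-8") as f:
        f.write(f"\n## {stamp}\n\n{prompt}\n")
sys.exit(0)  # exit code 0 lets the prompt through unchanged
```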

antirez 6 hours ago | parent

In this case, instead of a prompt I wrote a specification, but later I had to steer the models for hours. So basically the prompt is the sum of all such interactions: incredibly hard to reconstruct into something meaningful.

wyldfire 6 hours ago | parent

I've only just started using it, but the Ralph Wiggum / Ralph loop plugin seems like it could be useful here.

If the spec and/or tests are sufficiently detailed, maybe you can step back and let it churn until it satisfies the spec.
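Even without the plugin, the bare idea is just a loop: run Claude Code headless against the spec, check the tests, repeat. A minimal sketch, assuming `claude -p` for non-interactive runs and pytest as the harness (a real setup would also need to allow file edits and permissions in headless mode):

```python
#!/usr/bin/env python3
"""Naive spec-driven loop: ask the agent to satisfy the spec, then check the tests.

Assumes `claude -p "<prompt>"` runs Claude Code non-interactively and that
pytest encodes the spec; both are assumptions to adapt to your setup.
"""
import subprocess

PROMPT = "Read SPEC.md, then change the code until the whole pytest suite passes."

for attempt in range(10):                    # hard cap so it can't churn forever
    subprocess.run(["claude", "-p", PROMPT], check=False)
    if subprocess.run(["pytest", "-q"]).returncode == 0:
        print(f"Tests green after {attempt + 1} attempt(s).")
        break
else:
    print("Gave up: tests still failing after 10 attempts.")
```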

neomantra 6 hours ago | parent

Isn't the "steering" in the form of prompts? You note: "Even if the code was generated using AI, my help in steering towards the right design, implementation choices, and correctness has been vital during the development." You are a master of this; let others see how you cook, not just taste the sauce!

I only say this because it seems one of your motivations is education, and I'm also noting it for others to consider. Much appreciation either way; thanks for sharing what you did.

enriquto 6 hours ago | parent

This steering is the main "source code" of the program that you wrote, isn't it? Why throw it away? It's like deleting the .c once you have obtained the .exe.

minimaxir 6 hours ago | parent

It's more noise than signal: it's disorganized and hard to glean value from (speaking from experience).

stellalo 6 hours ago | parent

Doesn't Claude Code let you just dump entire conversations, with everything that happened in them?

joemazerino 6 hours ago | parent

All sessions are located in the `~/.claude/projects/foldername` subdirectory.

ukuina 6 hours ago | parent

Doesn't it lose prompts prior to the latest compaction?

jitl an hour ago | parent

I've sent Claude back to look at the transcript file from before compaction. It was pretty bad at it, but it did eventually recover the prompt and solution from the JSONL file.
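If you'd rather pull the prompts out yourself instead of sending Claude spelunking, something along these lines works on the transcripts I've inspected. The JSONL schema assumed here (a nested "message" object with "role" and "content") is inferred from those files, not from any documented format, so treat it as an assumption:

```python
#!/usr/bin/env python3
"""Extract user prompts from Claude Code session transcripts (*.jsonl).

The schema (top-level "message" object with "role" and "content") is inferred
from inspecting transcript files; adjust if your version differs.
"""
import json
from pathlib import Path

projects = Path.home() / ".claude" / "projects"

for transcript in sorted(projects.rglob("*.jsonl")):
    print(f"=== {transcript} ===")
    for line in transcript.read_text(encoding="utf-8").splitlines():
        try:
            entry = json.loads(line)
        except json.JSONDecodeError:
            continue                          # skip malformed lines
        msg = entry.get("message")
        if not isinstance(msg, dict) or msg.get("role") != "user":
            continue
        content = msg.get("content")
        # content may be a plain string or a list of {"type": "text", ...} blocks
        if isinstance(content, list):
            content = " ".join(
                block.get("text", "") for block in content if isinstance(block, dict)
            )
        if isinstance(content, str) and content.strip():
            print(content.strip() + "\n")
```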

onedognight 4 hours ago | parent

It loses them from the current context (say, 200k tokens), not from its on-disk history (which is limited only by your local storage).