linsomniac 3 hours ago

Yes, definitely! The AI tooling works much like a human: it does better if you have a solid specification in place before you start coding. My best successes have come from using a design document with clear steps and phases; usually the AI drafts and reviews that document as well, and I eyeball it.

Lots of people are using PRD files for this. https://www.atlassian.com/agile/product-management/requireme...

I've been using checklists and asking it to check off items as it works.

Another nice feature of these specs is that you can give the AI tools multiple kicks at the can and pick the result you like best, have multiple tools work on competing implementations, or have better tools rebuild the project a few months down the line.

So I might have a spec that starts off:

    #### Project Setup

    - [ ] Create new module structure (`client.py`, `config.py`, `output.py`, `errors.py`)
    - [ ] Move `ApiClient` class to `client.py`
    - [ ] Add PyYAML dependency to `pyproject.toml`
    - [ ] Update package metadata in `pyproject.toml`

And then I just iterate with a prompt like:

    Please continue implementing the software described in the file "dashcli.md".  Please implement the software one phase at a time, using the checkboxes to keep track of what has already been implemented.  Checkboxes ("[ ]") that are checked ("[X]") are done.  When complete, please do a git commit of the changes.  Then run the skill "codex-review" to review the changes and address any findings or questions/concerns it raises.  When complete, please commit that review changeset.  Please make sure to implement tests, and use tests of both the backend and frontend to ensure correctness after making code changes.
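Between iterations it's handy to see how far through the spec the tool has gotten. A minimal sketch of a progress check, assuming the same Markdown checkbox convention as above (the `checklist_progress` helper name is mine, not part of any tool):

```python
import re
from pathlib import Path

def checklist_progress(spec_path):
    """Count checked ("[X]"/"[x]") and unchecked ("[ ]") items in a spec file."""
    text = Path(spec_path).read_text()
    unchecked = re.findall(r"^\s*- \[ \]", text, flags=re.MULTILINE)
    checked = re.findall(r"^\s*- \[[xX]\]", text, flags=re.MULTILINE)
    return len(checked), len(unchecked)

if __name__ == "__main__":
    # "dashcli.md" is the spec file from the prompt above.
    done, todo = checklist_progress("dashcli.md")
    print(f"{done} done, {todo} remaining")
```

Running that between prompts gives a quick sanity check that the agent is actually ticking boxes rather than silently skipping phases.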