zarzavat 3 hours ago:
Yeah, prompting doesn't work for this problem, because the entire point of an LLM is that you give it the *what* and it outputs the *how*. The more *how* you have to condition it with in the prompt, the less profitable the interaction becomes. A few hints are OK, but doing all the work for the LLM tends to lead to negative productivity. Writing prompts and writing code take about the same amount of time for the same amount of text, plus there's the extra time the LLM takes to accomplish the task, and the review time afterwards. So if you have to specify every tiny implementation detail in the prompt, you might as well just write the code yourself.
kqr 2 hours ago:
Makes me think of this CommitStrip comic: https://i.xkqr.org/itscalledcode.jpg (mirrored from the original due to TLS issues with the original domain.)

A guy with a mug comes up to a person standing with their laptop on a small table.

Mug guy: "Some day we won't even need coders any more. We'll be able to just write the specification and the program will write itself."

Guy with laptop looks up: "Oh, wow, you're right! We'll be able to write a comprehensive and precise spec and bam, we won't need programmers any more!"

Guy with mug takes a sip: "Exactly!"

Guy with laptop: "And do you know the industry term for a project specification that is comprehensive and precise enough to generate a program?"

"Uh... no..."

"Code. It's called code."
FeepingCreature an hour ago:
The goal would be to write it a reusable prompt; this is what AGENT.md is for.
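For readers unfamiliar with the convention: AGENT.md (some tools use AGENTS.md or similar names) is a plain markdown file at the repo root that coding agents read before acting, so project-wide conventions are stated once instead of repeated in every prompt. A minimal sketch of what such a file might contain — the section names and commands below are illustrative, not a fixed schema:

```markdown
# AGENT.md — project conventions for coding agents

## Build and test
<!-- Commands below are examples; use your project's actual tooling. -->
- Build: `make build`
- Run tests before claiming a task is done: `make test`

## Code style
- TypeScript strict mode; no `any`.
- Prefer small pure functions; avoid adding new dependencies without asking.

## Boundaries
- Never edit files under `vendor/` or generated files (`*.gen.ts`).
- Database migrations require human review; propose, don't apply.
```

The idea matches the parent comment's point: the "how" that would otherwise bloat every prompt gets amortized into one checked-in document.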