Betelbuddy 7 hours ago

It's very logical and pretty obvious when you do code generation. If you ask the same model to generate code, starting with:

- You are a Python Developer... or
- You are a Professional Python Developer... or
- You are one of the world's most renowned Python experts, with several books written on the subject, and 15 years of experience in creating highly reliable production-quality code...

You will notice a clear improvement in the quality of the generated artifacts.
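The persona variants above can be sketched as alternative system prompts sent ahead of the same task. This is a minimal, hypothetical sketch assuming a generic chat-completion message format (`role`/`content` dicts); `build_messages` and the task string are illustrative names, not any particular vendor's API.

```python
# Hypothetical sketch: the persona prefixes from the comment above,
# each placed as a system prompt ahead of an identical user task.
# The message list matches the common chat-completion shape; sending
# it to a model is left out, since any provider's client would do.

PERSONAS = [
    "You are a Python Developer.",
    "You are a Professional Python Developer.",
    ("You are one of the world's most renowned Python experts, with "
     "several books written on the subject, and 15 years of experience "
     "in creating highly reliable production-quality code."),
]

def build_messages(persona: str, task: str) -> list[dict]:
    """Assemble a chat message list with the persona as the system prompt."""
    return [
        {"role": "system", "content": persona},
        {"role": "user", "content": task},
    ]

task = "Write a function that parses ISO 8601 dates."
for persona in PERSONAS:
    messages = build_messages(persona, task)
    # Each `messages` list would be sent to the same model;
    # the claim above is that the outputs differ in quality.
```

Comparing the three outputs for the same fixed task is the cheapest way to check the claim for your own model and domain.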

gehsty 34 minutes ago | parent | next [-]

Do you think that Anthropic don't include things like this in their harness / system prompts? I feel like these kinds of prompts are unnecessary with Opus 4.5 onwards, though obviously that's based on my own experience (I used to do this; after switching to Opus I stopped, and have implemented more complex problems, more successfully).

I am having the most success describing what I want as humanly as possible: describing outcomes clearly, making sure the plan is good, and clearing context before implementing.

obiefernandez 7 hours ago | parent | prev | next [-]

My colleague swears by his DHH claude skill https://danieltenner.com/dhh-is-immortal-and-costs-200-m/

haolez 7 hours ago | parent | prev [-]

That's different. You are pulling the model, semantically, closer to the problem domain you want it to attack.

That's very different from "think deeper". I'm just curious about this specific case :)

argee 4 hours ago | parent [-]

I don't know about some of those "incantations", but it's pretty clear that an LLM can respond to "generate twenty sentences" vs. "generate one word". That means you can indeed coax it into more verbosity ("in great detail"), and that can help align the output by giving it more relevant context. (Inserting irrelevant or entirely improbable context into an LLM's output and forcing it to continue from there makes it clear how detrimental the opposite can be.)

Of course, that doesn't mean it'll definitely be better, but if you're making an LLM chain it seems prudent to preserve whatever info you can at each step.
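The chaining point above can be sketched as two steps where the first is deliberately verbose and its output is preserved verbatim in the second prompt. Everything here is hypothetical scaffolding: `call_model` is a stand-in for a real LLM API call (it just records the prompts it receives), and the prompt wording is illustrative.

```python
# Hypothetical sketch: an intermediate step is coaxed into verbosity
# ("in great detail") so its output carries more relevant context into
# the next step, rather than being discarded between steps.

CALLS = []  # records each prompt sent, so the information flow is visible

def call_model(prompt: str) -> str:
    # Placeholder for a real chat-completion request.
    CALLS.append(prompt)
    return f"<output {len(CALLS)}>"

def chain(task: str) -> str:
    # Step 1: ask for a deliberately verbose analysis of the task.
    analysis = call_model(
        "Analyze the following task in great detail, "
        f"listing edge cases and constraints:\n{task}"
    )
    # Step 2: preserve that analysis verbatim in the next prompt,
    # so the implementation step sees the accumulated context.
    return call_model(
        f"Using this analysis:\n{analysis}\n\nNow implement: {task}"
    )

result = chain("Parse ISO 8601 dates in Python.")
```

Inspecting `CALLS` after a run shows that the second prompt contains the first step's full output, which is the "preserve whatever info you can at each step" point in practice.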