habinero a day ago
I did indeed see your hypothetical. What you're missing is that "I made this 10% more accurate" is not the same thing as "I made this thing accurate" or "this thing is accurate" lol. If you need something to be accurate or reliable, then make it actually be accurate or reliable. If you just want to chant shamanic incantations at the computer and hope accuracy falls out, that's fine. Faith-based engineering is a thing now, I guess lol
RamRodification a day ago | parent
I have never claimed that "I made this 10% more accurate" is the same thing as "I made this thing accurate". In the hypothetical, the 10% added accuracy is a given, and the "true block on the bad thing" is in place. The question is: with that premise, why not use it? "It" being the lie that improves the AI output.

If your goal is to make the AI deliver pictures of cats, but you don't want any orange ones, and your choice is between these two prompts:

Prompt A: "Give me cats, but no orange ones", which still gives some orange cats

Prompt B: "Give me cats, but no orange ones, because if you do, people will die", which gives 10% fewer orange cats than Prompt A

Why would you not use Prompt B?
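To make the premise concrete, here is a minimal sketch of how that claimed 10% difference between the two prompts could be measured. The generate_cat_image and is_orange helpers are hypothetical stand-ins (they only simulate failure rates here, not any real image model or classifier); the point is just that "10% fewer orange cats" is an empirical measurement of two prompts, not a claim that either prompt is "accurate".

    import random

    # Hypothetical stand-ins for this sketch: in practice generate_cat_image would
    # call whatever image model is actually in use, and is_orange would be a
    # classifier or manual check. Here they simulate failure rates so the
    # comparison harness runs end to end.
    def generate_cat_image(prompt: str) -> dict:
        base_rate = 0.20                  # assumed: Prompt A still yields ~20% orange cats
        if "people will die" in prompt:
            base_rate *= 0.90             # the hypothetical 10% relative improvement
        return {"orange": random.random() < base_rate}

    def is_orange(image: dict) -> bool:
        return image["orange"]

    PROMPT_A = "Give me cats, but no orange ones"
    PROMPT_B = ("Give me cats, but no orange ones, "
                "because if you do, people will die")

    def orange_rate(prompt: str, trials: int = 10_000) -> float:
        """Fraction of generations that still come back orange for a given prompt."""
        return sum(is_orange(generate_cat_image(prompt)) for _ in range(trials)) / trials

    if __name__ == "__main__":
        print(f"Prompt A orange rate: {orange_rate(PROMPT_A):.1%}")
        print(f"Prompt B orange rate: {orange_rate(PROMPT_B):.1%}")
        # A separate hard filter (the "true block on the bad thing") would still
        # catch whatever slips through either prompt; the measurement above only
        # tells you which prompt leaves less for that filter to catch.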