hardwaregeek | 3 days ago
My mental model for LLMs is that they’re a fuzzy compiler of sorts. Any kind of specification, whether that’s BNF or a carefully written prompt, will get “translated”. But if you don’t have anything to translate, it won’t output anything good.
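To make the analogy concrete, here is a minimal sketch. `call_llm` is a hypothetical placeholder, not any real API; the point is only the contrast between the two inputs handed to the same “compiler”:

```python
def call_llm(prompt: str) -> str:
    """Hypothetical stand-in: send `prompt` to a model, return its text."""
    raise NotImplementedError("wire this to a real model API")


# A specification worth "translating": a small BNF grammar plus concrete
# requirements. The model has real structure to compile into code.
precise_spec = """\
Write a recursive-descent parser in Python for this grammar:
  expr   ::= term (("+" | "-") term)*
  term   ::= factor (("*" | "/") factor)*
  factor ::= NUMBER | "(" expr ")"
Return a nested tuple AST, e.g. ("+", 1, ("*", 2, 3)) for "1 + 2 * 3".
"""

# No real specification: there is nothing to translate.
vague_spec = "write a parser"

# Same "compiler", very different source programs:
#   call_llm(precise_spec) -> plausibly a working parser
#   call_llm(vague_spec)   -> whatever the model guesses you meant
```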
gyomu | 3 days ago
> if you don’t have anything to translate it won’t output anything good

One of the greatest quotes in the history of computer science: “On two occasions I have been asked, ‘Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?’ ... I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.”
danielvaughn | 3 days ago
Yep, exactly. "Garbage in, garbage out" still applies.