Lapel2742 8 hours ago
> you are standing in front of your dishwasher waiting for it to grow arms and do your dishes in the sink.

No. I'm standing in front of the dishwasher, and the dishwasher expects me to tell it in detail how to wash the dishes. This is not about whether you can find any use at all for an LLM. This is about:

> LLMs are still surprisingly bad at some simple tasks

And yes, they are bad if you have to hand-feed them each and every detail of an extremely simple task like comparing two lists. You even have to debug the result, because you cannot be sure the dishwasher really washed the dishes. Maybe it just said it did.
Dilettante_ 7 hours ago
> Hand feed them each and every detail for an extremely simple task like comparing two lists

You believe 57 words are "each and every detail", and that "produce two full, exhaustive lists of items out of your black-box inner concept-space, or fetch them from the web" is an "extremely simple task"? Ignorance of how complex these problems are misleads you into believing there's nothing to them. You are trying to supply an abstraction to a system that requires a concrete, and you do not even realize your abstraction is an abstraction. Try learning programming.
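For what it's worth, the mechanical half of "comparing two lists" really is trivial once the lists exist as concrete data; the underspecified part is producing them. A minimal Python sketch (the item names and sample data are invented for illustration):

```python
# Hypothetical example: two concrete lists, compared with set operations.
# The comparison itself is two one-liners; nothing here generates the
# lists, which is the actual hard, underspecified part of the task.
installed = {"curl", "git", "vim"}
required = {"git", "vim", "tmux"}

missing = required - installed   # in required but not installed
extra = installed - required     # installed but not required

print(sorted(missing))  # ['tmux']
print(sorted(extra))    # ['curl']
```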