▲ | IX-103 4 days ago
> But they're not just "stochastic parrots." They can model the world and reason about it, albeit imperfectly and not like humans do.

I've not seen anything from a model to persuade me they're not just stochastic parrots. Maybe I just have higher expectations of stochastic parrots than you do. I agree with you that AI will have a big impact. We're talking about somewhere between "invention of the internet" and "invention of language" levels of impact, but it's going to take a couple of decades for this to ripple through the economy.
▲ | libraryofbabel 4 days ago | parent | next [-]
What is your definition of "stochastic parrot"? Mine is something along the lines of "produces probabilistic completions of language/tokens without having any meaningful internal representation of the concepts underlying the language/tokens." Early LLMs were like that. That's not what they are now.

An LLM got a gold-medal score on the International Mathematical Olympiad: very difficult math problems that it hadn't seen in advance. You don't do that without some kind of working internal model of mathematics. There is just no way you can get to the right answer by spouting out plausible-sounding sentence completions without understanding what they mean. (If you don't believe me, have a look at the questions.)
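For what it's worth, a literal stochastic parrot in that sense is easy to write down: a toy bigram Markov chain that continues text purely from surface co-occurrence counts. This is just an illustrative sketch (the corpus and function names are made up), not anybody's real model:

```python
import random
from collections import defaultdict

# A literal "stochastic parrot": a bigram Markov chain that continues text
# purely from co-occurrence counts. It has no representation of what any
# word means. Corpus and names are made up for illustration.

def train_bigrams(corpus):
    """Record, for each word, every word observed to follow it."""
    followers = defaultdict(list)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        followers[prev].append(nxt)
    return followers

def parrot(followers, start, length=10):
    """Emit a probabilistic completion by sampling from the observed counts."""
    out = [start]
    for _ in range(length):
        options = followers.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))
    return " ".join(out)

corpus = "the parrot repeats the words the parrot has seen the parrot repeats"
model = train_bigrams(corpus)
print(parrot(model, "the"))  # plausible-looking word salad, nothing more
```

Nothing like that gets you through an Olympiad problem, whatever you think today's models are doing internally.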
▲ | app134 4 days ago | parent | prev | next [-]
In-context learning is proof that LLMs are not stochastic parrots: a model that only parroted statistics from its training data couldn't pick up a brand-new rule from a couple of examples in the prompt and then apply it correctly.
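A concrete version of that argument, as a rough Python sketch: the doubling rule below is invented on the spot, so a correct completion can't be memorized from training text, and `query_llm` is a hypothetical placeholder for whatever model API you'd actually call.

```python
# In-context learning sketch: the rule is defined only inside the prompt.
# `query_llm` is a hypothetical stand-in, not a real library function.

few_shot_prompt = """Apply the rule shown in the examples.
blarg -> blargblarg
wibble -> wibblewibble
snork ->"""

# answer = query_llm(few_shot_prompt)  # hypothetical call
# A model that returns "snorksnork" has induced the doubling rule from two
# in-prompt examples; a parrot limited to replaying training-set statistics
# has no basis for doubling a nonsense token it never saw doubled.
print(few_shot_prompt)
```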
▲ | nuancebydefault 4 days ago | parent | prev [-]
Stochastic parrot here (or not?). Can you tell the difference?