SalmoShalazar | 3 days ago
The foregone conclusion that LLMs are the key to, or even a major step towards, AGI is frustrating. They are not, and we are fooling ourselves. They are incredible knowledge stores and statistical machines, but general intelligence is far more than these attributes.
quectophoton | 3 days ago
My thoughts are that LLMs are like cooking a chicken by slapping it: yes, it works, but you need to reach a certain amount of kinetic energy (the same way LLMs only "start working" after reaching a certain size). So then, if we can cook a chicken like this, we can also heat a whole house like this during winters, right? We just need a chicken-slapper that's even bigger and even faster, and slap the whole house to heat it up.

There are probably better analogies (because I know people will nitpick that we knew about fire way before kinetic energy), so maybe AI = "flight by inventing machines with flapping wings" and AGI = "space travel with machines that flap wings even faster". But the house-sized chicken-slapper illustrates how I view the current trend of trying to reach AGI by scaling up LLMs.
jibal | 3 days ago
Right ... as the article lays out.