samr71 2 days ago
Gary Marcus constantly repeats the line that "deep learning has hit a wall!1!" - he was saying this pre-ChatGPT even! It's very easy to dunk on him for this.

That said, his willingness to push back against orthodoxy means he's occasionally right. Scaling really does seem to have plateaued since GPT-3.5, hallucinations remain a problem that is perhaps unsolvable under the current paradigm, and LLMs do seem to have trouble with things far outside their training data.

Basically, while listening to Gary Marcus you will hear a lot of nonsense, but it will probably give you a better picture of reality if you can sort the wheat from the chaff. Listening only to Sam Altman, or other AI Hypelords, you'll think the Singularity is right around the corner. Listen to Gary Marcus, and you won't. Sam Altman has been substantially more correct on average than Gary Marcus, but I believe Marcus is right that the Singularity narrative is bogus.
unclebucknasty 2 days ago
> Sam Altman has been substantially more correct on average than Gary Marcus

I've seen some of Marcus's other writing and he's definitely a colorful dude. But is Altman really right more often, or more substantively?

Actually, the comparison shouldn't be to Altman but to the AI hype train in general. And, while I might have missed some of Marcus's writing on specific points, on the broader themes he seems to be effectively exposing the AI hype.
garymarcus a day ago
you obviously never actually read the paper; you should.