zoeysmithe 2 days ago

Yep this. LLMs are just autoregressive models.
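
For what it's worth, here's a minimal sketch of what "autoregressive" means in practice: the model only ever predicts the next token from the tokens it has already seen, with probabilities learned from its training data. The toy bigram table below is a made-up stand-in for a real LLM's learned weights; the counts and the next_token/generate helpers are all hypothetical.

    import random

    # Hypothetical bigram counts "learned" from a training corpus.
    # A real LLM conditions on far more context, but the principle is
    # the same: next-token probabilities come from past data.
    bigram_counts = {
        "the": {"earth": 3, "sun": 1},
        "earth": {"is": 4},
        "is": {"fixed": 2, "round": 2},
    }

    def next_token(prev: str) -> str:
        """Sample the next token given only the preceding one."""
        candidates = bigram_counts.get(prev)
        if not candidates:
            return "<eos>"
        tokens, weights = zip(*candidates.items())
        return random.choices(tokens, weights=weights)[0]

    def generate(prompt: str, max_len: int = 8) -> str:
        """Extend the prompt one token at a time, autoregressively."""
        tokens = prompt.split()
        while len(tokens) < max_len:
            tok = next_token(tokens[-1])
            if tok == "<eos>":
                break
            tokens.append(tok)
        return " ".join(tokens)

    print(generate("the"))  # e.g. "the earth is fixed"

By construction, a model like this can only recombine what its corpus already says, which is the point of the thought experiment below.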

Imagine if we had an LLM in the 15th century. It would happily explain the validity of the geocentric system; it couldn't get to heliocentrism. In the same way, modern LLMs can only tell us what we already know; they can't think, revolutionize, etc. They can be programmed to reason a bit, but 'reason' is doing a lot of heavy lifting here: for the most part the reasoning is just a better filter on what the person is asking or on what is being produced, not an actual novel creative act.

The more time I spend with LLMs, the more they feel like Google on steroids. I'm just not seeing how this type of system could ever lead to AGI, and if anything, it's probably eating away at any remaining AGI hype and funding.