PaulRobinson 9 hours ago

You need to see the comment I was replying to in order to understand the point I was making.

LLMs are part of what I was thinking of, but not the totality.

We're pretty close to generative AI - and by that I don't just mean LLMs, but the entire space - being able to use formal notations and abstractions more usefully and correctly, and therefore to reason better.

The comment I was replying to complained that this shifts value away from fundamentals, and called that a tragedy. My point is that this is just human progress. It's what we do. You buy a microwave, you don't build one yourself. You use the calculator app on your phone; you don't work out multiplication and division from first principles when splitting the bill at dinner.

I agree with your general take on all of this, but I'd add that AI will get to the point where it can express "thoughts" in formal language, and then provide appropriate tools to get the job done, and that's fine.

I might not understand Japanese culture without knowing Nihongo, but if I'm trying to get across Tokyo in rush-hour traffic and don't know how, do I need to understand Japanese culture, or do I need a tool that helps me achieve my objective?

If I care deeply about understanding Japanese culture, I will want to dive deep. And I should. But for many people, that's not their thing, and we can't all dive deep on everything, so having tools that do it for us better than existing tools is useful. That's my point: abstractions and tools let people get stuff done, which ultimately leads to better tools and better abstractions, and so on. Complaining that people don't have a first-principles grasp of everything isn't useful.