palata | 2 days ago
> What is actually new lately, in my experience, is that AI tools are a huge help for situations where you don't have either Type 1 or Type 2 knowledge of something

IMO, this is the whole point of the article: AI tools "help" a lot when we are completely uninformed. But in doing that, they prevent us from getting informed in the first place. Which is counter-productive in the long term.

I like to say that learning goes in iterations:

* First you accept new material (the teacher shows some mathematical concept and proves that it works). It convinces you that it makes sense, but you don't know enough to actually be sure that the proof was 100% correct.

* Then you try to apply it, with whatever you could memorise from the previous step. It looked easy when the teacher did it, but doing it yourself raises new questions. While doing this, you memorise it. Being able to say "I can do this exercise, but in this other one there is this difference and I'm stuck" means that you have memorised something.

* Now that you have memorised more, you can go back to the material and try to convince yourself that you now see how to solve the exercise you were stuck on.

* etc.

It's a loop of something like "accept, understand, memorise, use". If, instead, you prompt until the AI gives you the right answer, you're not learning much.
chain030 | 2 days ago
"IMO, this is the whole point of the article: AI tools "help" a lot when we are completely uninformed. But in doing that, they prevent us from getting informed in the first place. Which is counter-productive in the long term." Great way of framing it - simple and cuts straight to the heart of the issue. |