keiferski 3 days ago

I am sympathetic to memory-focused tools like Anki and Zettelkasten (haven't used the latter myself, though) but I think this post is a bit oversimplified.

I think there are at least two models of work that require knowledge:

1. Work when you need to be able to refer to everything instantly. I don't know if this is actually necessary for most scenarios other than live debates, or some form of hyper-productivity in which you need to have extremely high-quality results near-instantaneously.

(HN comments are, amusingly, also an example – comments that are in-depth but come days later aren't relevant. So if you want to make a comment that references a wide variety of knowledge, you'll probably need to already know it, in toto.)

2. Work when you need to "know a small piece of what you don't remember as a whole", or in other terms, know the map, but not necessarily the entire territory. This is essentially most knowledge work: research, writing, and other tasks that require you to create output, but that output doesn't need to be right now, like in a debate.

For example, you can know that person X said something important about topic Y without needing to know precisely what it was – you can just look it up later. However, you do still need to know what you're looking for, which is a kind of reference knowledge.

--

What is actually new lately, in my experience, is that AI tools are a huge help for situations where you don't have either Type 1 or Type 2 knowledge of something, and only have a kind of vague sense of the thing you're looking for.

Google and traditional search engines are functionally useless for this, but you can ask ChatGPT a question like, "I am looking for people who said something like XYZ." Previously this only worked if someone had asked the exact same question on Reddit or a forum, but now you can get a pretty good answer from AI.

throwway120385 3 days ago | parent | next [-]

The AI can also give you pretty good examples of the "kind" of thing you're after, which you can then evaluate. I've had it find companies that "do X" and then used those companies to understand enough about what I am or am not looking for to research it myself with a search engine. The last time I did this, none of what the AI provided ended up surfacing in my own research. It's more like talking to the guy in the next cubicle, hearing some suggestions, and using those suggestions to form my own opinion about what's important and digging in on that. You still have to do the work of forming an opinion. The ML model is just much better at recognizing relationships between different words and between features of a category of statements – in my case, statements that companies in a particular field tended to make on their websites.

palata 2 days ago | parent | prev | next [-]

> What is actually new lately, in my experience, is that AI tools are a huge help for situations where you don't have either Type 1 or Type 2 knowledge of something

IMO, this is the whole point of the article: AI tools "help" a lot when we are completely uninformed. But in doing that, they prevent us from getting informed in the first place. Which is counter-productive in the long term.

I like to say that learning goes in iterations:

* First you accept new material (the teacher shows some mathematical concept and proves that it works). This convinces you that it makes sense, but you don't know enough yet to actually be sure that the proof was 100% correct.

* Then you try to apply it, with whatever you could memorise from the previous step. It looked easy when the teacher did it, but doing it yourself raises new questions. And while doing this, you memorise it. Being able to say "I can do this exercise, but in this other one there is this difference and I'm stuck" means that you have memorised something.

* Now that you have memorised more, you can go back to the material, and try to convince yourself that you now see how to solve that exercise you were stuck with.

* etc.

It's a loop of something like "accept, understand, memorise, use". If, instead, you prompt until the AI gives you the right answer, you're not learning much.

chain030 2 days ago | parent [-]

"IMO, this is the whole point of the article: AI tools "help" a lot when we are completely uninformed. But in doing that, they prevent us from getting informed in the first place. Which is counter-productive in the long term."

Great way of framing it – simple, and it cuts straight to the heart of the issue.

rzzzt 3 days ago | parent | prev | next [-]

Pilots have checklists that they can follow without memorizing, but also memory items that have to be performed almost instinctively when they encounter the precondition events.

skybrian 3 days ago | parent | prev [-]

Live performance (like conversation or playing music) often relies on memory to do it well.

That might be a good criterion for how much to memorize: do you want to be able to do it live?