pcj-github | a day ago
This resonates with me. I feel like AI is making me learn slower. For example, I have been learning Rust for quite a while now. While AI has been very helpful in lowering the bar to /begin/ learning Rust, it's making it slower to achieve a working competence with it, because I always seem reliant on the LLM to do the thinking. I think I will have to turn off all the AI and struggle, struggle, struggle, until I don't, just like the old days.
imadethis | a day ago
I've found the same effect when I ask the LLM to do the thinking for me. If I say "rewrite this function to use a list comprehension", I don't retain anything. It's akin to looking at Stack Overflow and copying the first result, or going through a tutorial that tells you what to write without ever explaining it.

The real power I've found is using it as a tutor for my specific situation. "How do list comprehensions work in Python?" "When would I use a list comprehension?" "What are the performance implications?" Being able to see the answers to these with reference to the code on my screen and in my brain is incredibly useful. It's far easier to relate to the business logic I care about than to class Foo and method Bar.

Regarding retention, LLMs still don't hold a candle to properly studying the problem with (well-written) documentation or educational materials. The responsiveness, however, makes them a close second for overall utility.

ETA: This is regarding coding problems specifically. I've found LLMs fall apart pretty fast in other fields. I was poking at some astrophysics stuff and the answers were nonsensical from the jump.
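For concreteness, the kind of "rewrite this as a list comprehension" transformation described above might look like this (the function names are hypothetical, chosen just for illustration):

```python
# Explicit-loop version: filter even numbers, then square them.
def squares_of_evens_loop(nums):
    result = []
    for n in nums:
        if n % 2 == 0:
            result.append(n * n)
    return result

# Equivalent list comprehension: the same filter-and-map in one expression.
def squares_of_evens(nums):
    return [n * n for n in nums if n % 2 == 0]

print(squares_of_evens(range(6)))  # [0, 4, 16]
```

The point the comment makes is that accepting the second version from an LLM teaches you nothing, while asking *why* the two are equivalent (the `if` clause plays the role of the loop's filter, the leading expression plays the role of the `append`) is where the learning happens.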
jart | a day ago
Try using the LLM as a learning tool rather than asking it to do your job. I don't really like the way LLMs code. I like coding. So I mostly do that myself. However, I find it enormously useful to be able to ask an LLM questions.

You know the sort of question you need to ask to build an intuition for something? Not a clear problem-answer question you could just Google, but the sort of thing where you'd traditionally have to go hunt down a human being and ask them? LLMs are great at that. If I want to ask what the point of something is, an LLM can give me a much better idea than reading its Wikipedia page. This sort of personalized learning experience that LLMs offer, your own private tutor (rather than some junior developer you're managing), is why the schools that sit kids down with an LLM for two hours a day are crushing it on test scores.

It makes sense if you think about it. LLMs are superhuman geniuses in the sense of knowing everything. So use them for their knowledge. But knowing everything is distracting for them, and, for performance reasons, LLMs tend to do much less thinking than you do. So for any work where effort and focus count the most, you're better off doing it yourself, for now.
eschaton | a day ago
Why are you using an LLM at all when it'll both hamper your learning and be wrong?
whatnow37373 | 19 hours ago
The world will slowly, slowly converge on this, but not before many years of hyping and preaching about how this shit is the best thing since sliced bread and shoving it into our faces all day long. In the meantime, I suggest we be mindful of our AI usage and keep our minds sharp. We might be the only ones left after a decade or two of this.
neevans | 18 hours ago
Nah, you are getting it wrong. The issue here is YOU NO LONGER NEED TO LEARN RUST. That's why you are learning it slowly.