nicksergeant · 3 hours ago

Agree with pretty much everything you wrote here, with the addendum that LLMs can be part of the learning experience you're describing. It's as easy as telling the LLM: "Don't write a single line of code or a single command. I want to do everything myself; your goal is to help me understand what we're doing here." There will always be people who just want the end result; the only difference now is that LLM tools let them get much closer to it than they previously could. And on the other side, there will always be people who want to _understand_ what's happening, and LLMs can help accelerate that. I use LLMs as a personalized guide to learning new things.
tpdly · 44 minutes ago

I know it sounds extreme to dismiss that workflow, but I don't think people are talking enough about the subtle psychological consequences of LLM writing for this kind of thing. In the same way that googling for an SEO article's superficial answer means you never really bother to memorize it, "ask chat" seems to lead to never really bothering to think hard about it. Of course I google things, but maybe I should be trying to learn in a way that minimizes the need. Maybe it's important to learn how to learn in a way that minimizes exposure to sycophantic average-blog-speak.
agentultra · an hour ago

Best of luck on your journey! To those reading this thread, though: be wary of the answers LLMs generate. They sound plausible, and LLMs are designed to be sycophantic. Double-check their answers against credible sources, and read the source!