djray 14 hours ago
There seems to be a mistaken belief that having an AI (or indeed someone else) help you achieve a task means you aren't learning anything. This is reductionist. I suggest instead that it's about degrees of autonomy. The person you're responding to chose to get the AI to help integrate a library. They chose NOT to have the AI edit the files itself; instead they spent time reading through the changes, understanding the integration points, and tweaking the code to make it their own. This is very different from vibe coding.

I do a similar loop with my use of AI: I upload code to Gemini 2.5 Pro, talk through options and assumptions, and maybe get it to write some or all of the next step, or to try out different approaches to a refactor. Integrating any code back into the original source is never copy-and-paste, and that's where the learning is.

For example, I added Dexie (a library/wrapper for accessing IndexedDB) to a browser extension project the other day, and the AI helped me get started with a minimal amount of initial knowledge, yet I learned a lot about Dexie and have been able to expand upon the code myself since. On my own, I would probably have barrelled ahead and used IndexedDB directly, resulting in a lot more boilerplate code and time spent on busywork (a rough sketch of the difference is at the end of this comment).

It's this sort of friction reduction that I find most liberating about AI. Trying out a new library isn't a multi-hour slog; you can sample it and, if it turns out to be unsuitable, reject it almost immediately without wasting a lot of time on R&D. In my case, I didn't learn 'raw' IndexedDB, but I got the job done with a library offering a more suitable level of abstraction, and saved hours in the process. This isn't laziness or giving up the opportunity to learn; it's simply optimising your time.

The "not invented here" syndrome is something I kindly suggest you examine, as you may find you are limiting your own innovation by rejecting everything you can't do yourself.
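To illustrate, here is a minimal sketch of the kind of Dexie setup involved. The "notes" store, its fields, and the class name are hypothetical placeholders rather than code from the actual extension; the Dexie calls themselves (version().stores(), add(), orderBy()) are the library's real API.

    // Hypothetical example: a small Dexie-backed store for a browser extension.
    import Dexie, { type Table } from 'dexie';

    interface Note {
      id?: number;       // auto-incremented primary key
      title: string;
      body: string;
      updatedAt: number;
    }

    class ExtensionDb extends Dexie {
      notes!: Table<Note, number>;

      constructor() {
        super('extension-db');
        // One declarative line replaces the manual onupgradeneeded /
        // createObjectStore / createIndex plumbing of raw IndexedDB.
        this.version(1).stores({
          notes: '++id, title, updatedAt',
        });
      }
    }

    const db = new ExtensionDb();

    async function demo(): Promise<void> {
      // Promise-based reads and writes; no request/event handlers needed.
      await db.notes.add({ title: 'hello', body: 'world', updatedAt: Date.now() });
      const recent = await db.notes.orderBy('updatedAt').reverse().limit(10).toArray();
      console.log(recent);
    }

    demo();

With raw IndexedDB, the same operations would mean opening the database via indexedDB.open, handling onupgradeneeded/onsuccess/onerror callbacks, and wrapping every read and write in an explicit transaction; that is the boilerplate being avoided here.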
shaky-carrousel 14 hours ago
It's not reductionist, it's a fact. If you ask an LLM to write you something in Python instead of learning Python, you won't learn a line of Python in the process, even if you read the produced code from beginning to end. Because (and honestly I'm surprised I have to point this out, here of all places) you learn by writing code, not by reading it.
bluefirebrand 13 hours ago
> The "not invented here" syndrome is something I kindly suggest you examine I think AI is leading to a different problem. The "nothing invented here" syndrome Using LLMs is not the same as offloading the understanding of some code to external library maintainers. It is offloading the understanding of your own code, the code you are supposed to be the steward of, to the LLM |