| |
| ▲ | djray 13 hours ago | parent | next [-] | | There seems to be a mistaken belief that having an AI (or indeed someone else) help you achieve a task means you aren't learning anything. This is reductionist. I suggest instead that it's about degrees of autonomy. The person you're responding to chose to have the AI help integrate a library. They chose NOT to have the AI edit the files itself; instead, they spent time reading through the changes, understanding the integration points, and tweaking the code to make it their own. This is very different from vibe coding.

I follow a similar loop in my own use of AI: I upload code to Gemini 2.5 Pro, talk through options and assumptions, and maybe have it write some or all of the next step, or try out different approaches to a refactor. Integrating any code back into the original source is never copy-and-paste, and that's where the learning happens. For example, I recently added Dexie (a wrapper library for IndexedDB) to a browser extension project, and the AI helped me get started with a minimal amount of initial knowledge, yet I learned a lot about Dexie and have been able to expand upon the code myself since. On my own, I would probably have barrelled ahead and just used IndexedDB directly, resulting in a lot more boilerplate code and time spent on busywork.

It's this sort of friction reduction that I find most liberating about AI. Trying out a new library is no longer a multi-hour slog; you can sample it and, if it proves unsuitable, reject it almost immediately without wasting a lot of time on R&D. In my case, I didn't learn 'raw' IndexedDB, but I got the job done with a library offering a more suitable level of abstraction, and saved hours in the process. This isn't laziness or giving up the opportunity to learn; it's simply optimising your time.
The "not invented here" syndrome is something I kindly suggest you examine, as you may find you are actually limiting your own innovation by rejecting everything that you can't do yourself. | | |
| ▲ | shaky-carrousel 13 hours ago | parent | next [-] | | It's not reductionist, it's a fact. If, instead of learning Python, you ask an LLM to write something in Python for you, you won't learn a line of Python in the process. Even if you read the produced code from beginning to end. Because (and honestly I'm surprised I have to point this out, here of all places) you learn by writing code, not by reading code. | | |
| ▲ | rybosome 12 hours ago | parent [-] | | I encourage you to try this yourself and see how you feel. Recently I used an LLM to help me build a small application in Rust, having never used the language before (though I had a few years of high-performance C++ experience).

The LLM wrote most of the code, but never more than ~100 lines at a time; then I’d tweak, insert, commit, and plan the next feature. I hand-wrote very little, but I was extremely involved in the design and layout of the app. Without question, I learned a lot about Rust. I used tokio’s async runtime, its mpsc channels, and streams to make a high-performance crawler that worked really well for my use case. If I needed to write Rust without an LLM now, I believe I could, though it would be slower and harder.

There’s definitely a “turn my brain off and LLM for me” way to use these tools, but it is reductive to state that ALL usage of such tools is like this. | | |
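(The fan-out/fan-in pattern described above — worker tasks fetching pages and reporting results over mpsc channels — can be sketched without the async machinery. This is a minimal, dependency-free illustration using `std::thread` and `std::sync::mpsc` in place of tokio's async runtime; `fetch()` is a hypothetical stand-in for a real HTTP request, not part of any library.)

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical stand-in for a real HTTP fetch.
fn fetch(url: &str) -> String {
    format!("<html>contents of {}</html>", url)
}

// Fan out one worker thread per URL; each worker sends its result
// back over a shared channel, and the receiver collects them all.
fn crawl(urls: &[&str]) -> Vec<(String, String)> {
    let (tx, rx) = mpsc::channel();
    for &url in urls {
        let tx = tx.clone();
        let url = url.to_string();
        thread::spawn(move || {
            let body = fetch(&url);
            tx.send((url, body)).expect("receiver should be alive");
        });
    }
    // Drop the original sender so the channel closes once every
    // worker has finished, letting the collection loop terminate.
    drop(tx);
    rx.into_iter().collect()
}

fn main() {
    let pages = crawl(&["https://example.com/a", "https://example.com/b"]);
    println!("fetched {} pages", pages.len());
}
```

In tokio the same shape uses `tokio::spawn` and `tokio::sync::mpsc` with awaited sends and receives, but the channel-based ownership transfer is the same idea.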
| ▲ | shaky-carrousel 12 hours ago | parent [-] | | Of course you've learned a lot about Rust. What you haven't learned is how to program in Rust. Try, a month from now, to write that application in Rust from scratch, without any LLM help. If you can, then you truly learned to program in Rust. If you can't, then what you learned is just generic trivia about Rust. |
|
| |
| ▲ | bluefirebrand 11 hours ago | parent | prev [-] | | > The "not invented here" syndrome is something I kindly suggest you examine

I think AI is leading to a different problem: the "nothing invented here" syndrome. Using LLMs is not the same as offloading the understanding of some code to external library maintainers. It is offloading the understanding of your own code, the code you are supposed to be the steward of, to the LLM. |
| |
| ▲ | sekai 12 hours ago | parent | prev | next [-] | | > Did you write your own letters? Did you write your own arguments? Did you write your own code? I do, and don't depend on systems others built to do so. And losing the ability to keep doing so is a pretty big trade-off, in my opinion.

Gatekeeping at its finest: you're not a "true" software engineer unless you're editing the kernel on your own, locked in a cubicle, with no external help. | | |
| ▲ | shaky-carrousel 12 hours ago | parent [-] | | That... doesn't even begin to make sense. Defending the ability to code without relying on three big corporations is... absolutely unrelated to gatekeeping. |
| |
| ▲ | danenania 11 hours ago | parent | prev [-] | | Unless you're writing machine code, you aren't really writing your own code either. You're giving high-level instructions, which depend on many complex systems built by thousands of engineers to actually run. | | |
| ▲ | shaky-carrousel 5 hours ago | parent [-] | | Yes, and my computer is using electricity I'm not directly generating with a bike, but all that is beside the point. | | |
| ▲ | danenania 3 hours ago | parent [-] | | Yeah, you depend on many layers of infrastructure for that electricity as well. It’s exactly the point. All the criticisms you level at people coding with LLMs apply just as much to the artisanal hand-sculpted code you’re so proud of. |
|
|
|