| ▲ | munk-a 6 hours ago |
> Stop trying to use it as all-or-nothing. You can still make the decisions, call the shots, write code where AI doesn't help, and then use AI to speed up parts where it does help.

You're assuming that finding the places where AI needs help isn't already a larger task than just writing it yourself. AI can be helpful in development in very limited scenarios, but the main thrust of the comment above yours is that it takes longer to read and understand code than to write it, and AI tooling is currently focused on writing code. We're optimizing the easy part at the expense of the difficult part; in many cases it simply isn't worth the trouble. (The cases where it is helpful, imo, are when AI assists with code comprehension rather than new code production.)
|
| ▲ | Aurornis 6 hours ago | parent | next [-] |
> You're assuming that finding the places where AI needs help isn't already a larger task than just writing it yourself.

Not assuming anything; I'm well versed in how to do this. Anyone who defers to having AI write massive blocks of code they don't understand is going to run into this. You have to understand what you want and guide the AI to write it. The AI types faster than me: I can have the idea and the understanding, then tell the LLM to rearrange the code or do the boring work faster than I could type it myself.
| |
| ▲ | Exoristos 6 hours ago | parent | next [-] |
The number of devs I've worked with who can't touch-type and don't use, or know their way around, a proper IDE is depressingly large.
| ▲ | Aurornis 5 hours ago | parent | next [-] |
Same with debuggers. I run into people with 10 years of experience who are still trying to printf-debug complex problems that would be easy with 5 minutes in a debugger.

I think we're seeing something similar with AI: there are devs who spend a couple of days trying to get AI to magically write all of their code for them and then swear it off forever, thinking they're the only people who see the reality of AI and everyone else is wrong.
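A minimal sketch (not from the thread) of the printf-vs-debugger point: a conditional breakpoint filters execution down to the one iteration that matters, instead of printing on every pass and grepping the output. The invoice scenario, file name, and line number in the comments are hypothetical, for illustration only.

```python
def line_total(price: float, qty: int) -> float:
    """Price a line item; bulk orders (qty >= 100) get a discount."""
    if qty >= 100:
        return price * qty * 0.85  # suppose a subtle bug hides in this branch
    return price * qty

def invoice_total(orders):
    running = 0.0
    for i, (price, qty) in enumerate(orders):
        # printf style: print(i, price, qty, running) on every iteration,
        # then scroll thousands of log lines looking for the bad one.
        #
        # debugger style: one conditional breakpoint does the filtering:
        #   $ python -m pdb invoice.py
        #   (Pdb) break invoice.py:18, qty >= 100
        #   (Pdb) continue
        # Execution pauses only on the bulk-order iteration, with every
        # local variable inspectable on the spot.
        running += line_total(price, qty)
    return running

print(invoice_total([(1.0, 1), (2.0, 100)]))  # 1.0 + 170.0 = 171.0
```

The same filtering can be done in-process with a guarded `breakpoint()` call, but the pdb `break file:line, condition` form needs no code change at all.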
| ▲ | munk-a 2 hours ago | parent | next [-] |
At the same time, there are devs who spend two days setting up a debugger for a simple problem that would be easy with five minutes and printf. AI is a tool, and a useful one; it's not always the best tool for the job, and the real skill is in knowing when to use it and when not to. It's a fact of life that the easy problems are already solved: the ones where an extreme answer is always correct are things we no longer even consider problems. Most of the options that remain have both advantages and disadvantages, so the true answer is somewhere in the middle.
| ▲ | hunterpayne an hour ago | parent | prev [-] |
Right, but then the AI doesn't have a positive ROI. In all fairness, it never has a positive ROI, but now it's much more negative, to the point that the accountants will put an end to the experiment once year end reveals how negative it really is.
| |
| ▲ | throwuxiytayq 2 hours ago | parent | prev [-] |
This isn't about touch typing or IDE tricks. I'm an IDE power user and, reasoning aside, I used to run circles around my peers when it comes to raw code-editing efficiency. This is increasingly an obsolete workflow. LLMs can execute codebase-wide refactors in seconds. You can use them as a (foot-)shotgun, or as a surgical tool.
| ▲ | Exoristos 2 hours ago | parent [-] |
So many are masters of AI marketing, it's thinkable one of them has mastered AI.
|
| |
| ▲ | ryan_n 6 hours ago | parent | prev | next [-] |
You've come full circle and are essentially just describing what the OP was saying in their initial post, lol.
| ▲ | kakacik 6 hours ago | parent | prev [-] |
If you are trying to sell it, you are doing a poor job: you're effectively siding with the OP while desperately trying to argue the opposite. Juniors mostly behave better than what you describe; I certainly never had to correct any junior's work as much as the OP describes. If you have 'boring code' in your codebase, maybe that signals a not-so-great architecture (and I presume we aren't talking about codegen, which has existed since the 90s at least). Also, any senior worth their salt wants to intimately understand their code; it's the only way you can guarantee correctness at all. Man, I could go on and pick your statements apart one by one, but that would take too long.
|
|
| ▲ | _puk 5 hours ago | parent | prev [-] |
The problem I have with this take is that it's focused on solving the right-now problem. Yes, it's quicker to do it yourself this time, but if we build out the artifacts to do a good-enough job this time, next time the AI will have all the context it needs to take a good shot at it, and if you get overtaken by AI in the meantime, you've got an insane head start. Which side of history are you betting on?
| |
| ▲ | munk-a 5 hours ago | parent [-] |
I don't believe that investing more of my time in a slower process now would give me an advantage once that process is refined. I've toyed around with these tools and know enough to get an environment up and running, so what would I gain from using them more right now, when those tools may change significantly before they settle into more efficient usage? I'm okay not being at the bleeding edge; I can see the remains of the companies that aggressively switched to the newest thing. Sometimes it pays off and sometimes it doesn't. I'm comfortable being a person who waits until something hits a 2.0 and the advantages and disadvantages are clear before seriously considering a migration.
|