|
| ▲ | JeremyNT 2 hours ago | parent | next [-] |
| This is a part of it, but I also feel like a Luddite (the historical meaning, not the derogatory slang). I do use these tools, clearly see their potential, and know full well where this is going: capital is devaluing labor. My skills will become worthless. Maybe GP is right that at first only skilled developers can wield them to full effect, but it's obviously not going to stop there. If I could destroy these things - as the Luddites tried - I would do so, but that's obviously impossible. For now I'm forced to use them to stay relevant, and simply hope I can hold on to some kind of employment long enough to retire (or switch careers). |
| |
| ▲ | visarga an hour ago | parent | next [-] | | > know full well where this is going: capital is devaluing labor But now you too can access AI labor. You can use it for yourself directly. | | |
| ▲ | UtopiaPunk 12 minutes ago | parent [-] | | Kind of. But the outcomes likely do not benefit the masses. People "accessing AI labor" is just a race to the bottom. Maybe some new tools get made or small businesses get off the ground, but ultimately this "AI labor" is a machine that is owned by capitalists. They dictate its use, and they will give or deny people access to the machine as it benefits them. Maybe they get the masses dependent on AI tools that are currently either free or underpriced; as alternatives to AI wither away, unable to compete on cost, the prices are raised or the product is enshittified. Or maybe AI will be massively useful to the surveillance state and data brokers. Maybe AI will simply replace a large percentage of human labor in large corporations, leading to mass unemployment. I don't fault anyone for trying to find opportunities to provide for themselves and their loved ones in this moment by using AI to make a thing. But don't fool yourself into thinking that the AI labor is yours. The capitalists own it, not us. |
| |
| ▲ | jonas21 an hour ago | parent | prev | next [-] | | > If I could destroy these things - as the Luddites tried - I would do so, but that's obviously impossible. Certainly, you must realize how much worse life would be for all of us had the Luddites succeeded. | | |
| ▲ | toprerules 41 minutes ago | parent [-] | | If the human race is wiped out by global warming, I'm not so sure I would agree with this statement. Technology almost always has downsides that are only discovered in hindsight, IMO. |
| |
| ▲ | Der_Einzige 2 hours ago | parent | prev [-] | | The historical Luddites are literally the human death drive externalized. Reject them and all of their garbage ideas with extreme prejudice. Relatedly, the word “meritocracy” was coined in a book that was extremely critical of the whole concept. AI thankfully destroys it. Good riddance, don't let the door hit your ass on the way out. https://en.wikipedia.org/wiki/The_Rise_of_the_Meritocracy | | |
| ▲ | mbgerring 2 hours ago | parent | next [-] | | You can reject the ideas in the aggregate. Regardless, for the individual, your skills are being devalued, and what used to be a reliable livelihood tied to a real craft is going to disappear within a decade or so. Best of luck | |
| ▲ | takklob 20 minutes ago | parent | prev [-] | | I bet you’re one of the same dumbasses who fell hook, line and sinker for the cold fusion fraud a few years back lmao. |
|
|
|
| ▲ | sho_hn 2 hours ago | parent | prev | next [-] |
| I resonate with that. I also find writing code super pleasurable. It's immediate stress relief for me; I love the focus and the flow. I end long hands-on coding sessions with a giddy high. What I'm finding is that it's possible to integrate AI tools into your workflow in a big way without giving up on that, and I think there's a lot to be said for a hybrid approach. The result of a fully-engaged brain (which still requires being right in there with the problem) using AI tools is better than the fully-hands-off way touted by some. Stay confident in your abilities and find your mix/work loop. It's also possible to get a certain version of the rewards of coding from instrumenting AI tools. E.g. slicing up and sizing tasks for background agents - tasks you can intuit from experience they'll actually hand in a decent result on - feels a lot like the structuring/modularization exercises (e.g. aiming for readability or maintainability) you do when writing code yourself. |
| |
| ▲ | bearfox 4 minutes ago | parent [-] | | I'm in the "enjoy writing code" camp and do see the merits of the hybrid approach, but I also worry about the (mental) costs. I feel that to use AI effectively I need to be fully engaged with both the problem itself and the additional problem of communicating with the LLM - which is more taxing than pre-LLM coding. And if I'm not fully engaged, the outcomes usually aren't that great and bring frustration. In isolation, the shift might be acceptable, but in reality I'm still left with a lot of ineffective meetings - only now without coding sessions to clear my brain. |
|
|
| ▲ | jayd16 2 hours ago | parent | prev | next [-] |
| Hope: I want to become a stronger dev. Reality: Promoted to management (of AI) without the raise or clout or the reward of mentoring. |
| |
| ▲ | organsnyder 2 hours ago | parent | next [-] | | > ...the reward of mentoring. I really feel this. Claude is going to forget whatever correction I give it, unless I take the time and effort to codify it in the prompt. And LLMs are going to continue to get better (though the curve feels like it's flattening), regardless of whatever I do to "mentor" my own session. There's no feeling that I'm contributing to the growth of an individual, or the state-of-the-art of the industry. | |
| ▲ | rurp 2 hours ago | parent | prev [-] | | LLMs are similar in a lot of ways to the labor outsourcing that happened a generation or two ago. Except that instead of this development lifting a billion people out of poverty in the third world, a handful of rich people will get even richer and everyone else will have higher energy bills. |
|
|
| ▲ | blibble 2 hours ago | parent | prev | next [-] |
| Exactly. Thankfully I started down the FIRE route 20 years ago, and now I'm more or less continuing to work because I want to - which will end for my employer if they insist on making me output generative excrement. |
|
| ▲ | icedchai an hour ago | parent | prev | next [-] |
| There's room for both. Give AI the boilerplate and save the exciting stuff for yourself. |
|
| ▲ | p-t an hour ago | parent | prev | next [-] |
| i agree. it seems like there's an expectation these days to use AI at least sometimes... for me, i'm happy not using it at all; i like to be able to say "I made this" :) |
| |
| ▲ | neilellis 29 minutes ago | parent [-] | | I suppose the question is "Do you feel Steve Jobs made the iPhone?" Not saying either answer is right or wrong, but it's a useful Rorschach test: what do you feel defines 'making this'? | | |
| ▲ | p-t 16 minutes ago | parent [-] | | it's more just a personal want to see what I can do on my own, tbh; i don't generally judge other people on that measure, although i do think Steve Jobs didn't make the iPhone /alone/ - a lot of other people contributed to it. i'd like to be able to name the people who help me and not just say "gemini". again, it's more of a personal thing lol |
|
|
|
| ▲ | QuercusMax 33 minutes ago | parent | prev | next [-] |
| I like writing new, interesting code, but learning framework #400 with all its own idiosyncrasies has gotten really old. I just rebuilt a fairly simple personal app that I've been maintaining for my family for nearly 30 years, and had a blast doing it with an AI agent - I mostly used Claude Sonnet 4.5. I'd been dreading this rebuild mostly because it's so boring; this is an app I originally built when I was 17, and I'm 43 now. I treated Claude basically like I'd treat my 17-year-old self, and I've added a bunch of features that I could never be assed to do before. |
|