jennyholzer 5 days ago

I think developers who use "AI" coding assistants are putting their careers at risk.

dguest 5 days ago | parent | next [-]

And here I'm wondering if I'm putting my career at risk by not trying them out.

Probably both are true: you should try them out and then use them where they are useful, not for everything.

Taek 5 days ago | parent | prev [-]

HN is full of people who say LLMs aren't good at coding and don't "really" produce productivity gains.

None of my professional life reflects that whatsoever. When used well, LLMs are exceptional at putting out large amounts of code of sufficient quality. My peers have switched entire engineering departments to LLM-first development and are reporting that the whole org is moving 2x as fast, even after they fired the 50% of devs who couldn't make the switch and didn't hire replacements.

If you think LLM coding is a fad, your head is in the sand.

bgwalter 5 days ago | parent | next [-]

The instigators say they were correct and fired their political opponents. Unheard of!

I have no doubt that volumes of code are being generated and LGTM'd.

mooxie 5 days ago | parent | prev | next [-]

Agreed. I work for a tiny startup where I wear multiple hats, and one of them is DevOps. I manage our cloud infra with Terraform, and anyone who has scaled cloud infrastructure from a <10 head-count company to a successful 500+ one knows how critical it is to get a handle on the infrastructure early. It's basically now or never.

It used to take me days or even multiple sprints to complete large-scale infrastructure projects, largely because of having to repeatedly reference Terraform cloud provider docs for every step along the way.

Now I use Claude Code daily. I use an .md to describe what I want in as much detail as possible and with whatever idiosyncrasies or caveats I know are important from a career of doing this stuff, and then I go make coffee and come back to 99% working code (sometimes there are syntax errors due to provider / API updates).
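To give a feel for the workflow: a hypothetical sketch of what such a spec .md might look like (every name, CIDR, and policy detail below is invented for illustration, not my actual file):

```markdown
# Infra request: staging VPC

- Provider: AWS, region us-east-1, Terraform >= 1.5
- One VPC (10.0.0.0/16) with two public and two private subnets across two AZs
- One NAT gateway per AZ (single-NAT outages have bitten us before)
- Tag everything with `env = "staging"` and `owner = "devops"`
- Caveat: our policy module requires `force_destroy = false` on all
  `aws_s3_bucket` resources -- do not flip it
```

The point is that the caveats and idiosyncrasies go into the spec up front, so the generated Terraform needs review rather than rework.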

I love learning, and I love coding. But I am hired to get things done, and to succeed (both personally and in my role, which is directly tied to our organization's security, compliance, and scalability) I can't spend two weeks on my pet projects for self-edification. I also have to worry about the million things that Claude CAN'T do for me yet, so whatever it can take off of my plate is priceless.

I say the same things to my non-tech friends: don't worry about it 'coming for your job' yet - just consider that your output and perceived worth as an employee could benefit greatly from it. If it comes down to two awesome people but one can produce even 2x the amount of work using AI, the choice is obvious.

010101010101 5 days ago | parent | prev | next [-]

Yesterday I used Warp’s LLM integrations to write two shell scripts that would have taken me longer to author myself than to do the task manually. Of the three options, this was the fastest by a wide margin.

For this kind of low stakes, easily verifiable task it’s hard to argue against using LLMs for me.

dguest 5 days ago | parent | prev [-]

Right now I'm mostly an "admin" coder: I look at merge requests and tell people how to fix stuff. I point them to LLMs a lot too. People I know who are actually writing a lot of code are usually saying LLMs are nice.

010101010101 5 days ago | parent | prev | next [-]

Developers who don’t understand how the most basic aspects of systems they work on function are a dime a dozen already, I’m not sure LLMs change the scale of that problem.

baq 5 days ago | parent | prev | next [-]

fighter jet pilots who use the ejection seat are putting their careers at risk, but so are the ones who don't use it when they should.

bookofjoe 5 days ago | parent [-]

>F-35 pilot held 50-minute airborne conference call with engineers before fighter jet crashed in Alaska

https://edition.cnn.com/2025/08/27/us/alaska-f-35-crash-acci...

flanked-evergl 5 days ago | parent | prev | next [-]

The future is increased productivity. If someone can outproduce you by using AI, they will take your job.

tmcb 5 days ago | parent | next [-]

This is industrial-grade FOMO. They will take the jobs of the first handful of people. The moment it is obvious that LLMs are a productivity booster, people will learn how to use them, just as happened with every other technology before.

boesboes 5 days ago | parent | prev | next [-]

After working with claude code for a few months, I am not worried.

falcor84 5 days ago | parent [-]

What does that mean? If you're still paying for Claude Code, you are presumably getting increased productivity, right? Otherwise, why are you still using it?

lexandstuff 5 days ago | parent [-]

I find it useful. A nice little tool in the toolkit: saves a bunch of typing, helps to over come inertia, helps me find things in unfamiliar parts of the codebase, amongst other things.

But for it to be useful, you have to already know what you're doing. You need to tell it where to look, and review what it does carefully. Also, I sometimes find that particularly hairy bits of code need to be written completely by hand, so I can fully internalise the problem. Only once I've internalised the hard parts of the codebase can I effectively guide CC. Plus there are so many other things in my day-to-day where next-token predictors are just not useful.

In short, it's useful, but no one's losing a job because it exists. Also, the idea of having non-experts manage software systems at any moderate-or-above level of complexity is still laughable.

falcor84 5 days ago | parent [-]

I don't think the concern is that non-experts would manage large software systems, but that experts would use it to manage larger software systems on their own before needing to hire additional devs, and in that way reduce the number of available roles. I.e. it increases the "pain threshold" before I would say to myself "it's worth the hassle to hire and onboard another dev to help with this".

hackable_sand 5 days ago | parent | prev [-]

Blink twice if your employer is abusing you

falcor84 5 days ago | parent | prev | next [-]

I would say that the careers of everyone who views themselves as writing code for a living are already at great risk. So if you're in that situation, you have to see how to go up (or down) the ladder of abstraction, and getting comfortable with using GenAI is possibly a good way to do that.

unethical_ban 5 days ago | parent | prev | next [-]

Were accountants who adopted Excel foolish?

Like any new tool that automates a human process, humans must still learn the manual process to understand the skill.

Students should still learn to write all their code manually and build things from the ground up before learning to use AI as an assistant.

micromacrofoot 5 days ago | parent | prev [-]

everyone's also telling us that if we don't use AI we're putting our careers at risk, and that AI will eventually take our jobs

personally I think everyone should shut up