raw_anon_1111 11 hours ago

In my experience, inefficient code is rarely the issue outside of data-engineering-type ETL jobs. It’s mostly architectural. Inefficient code isn’t the reason your login is taking 30 seconds. Yes, I know at Amazon/AWS scale (former employee) every efficiency matters. But even at Salesforce scale, wringing out every bit of efficiency doesn’t matter.

No one cares about handcrafted artisanal code as long as it meets both functional and non-functional requirements. The sooner geeks get over thinking they’re some type of artist, the happier they’ll be.

I’ve had a job that requires coding for 30 years, and before that I was a hobbyist. I’ve worked at everything from 60-person startups to BigTech.

For my last two projects (consulting) and my current project, I led the project, gathered the requirements, designed the architecture from an empty AWS account (yes, using IaC), and delivered it, yet I didn’t look at a line of code. I verified the functional and non-functional requirements, wrote the hand-off documentation, etc.

The customer is happy, my company is happy, and I bet you not a single person will ever look at a line of code I wrote. If they do get a developer to take it over, the developer will be grateful for my detailed AGENTS.md file.

sarchertech 10 hours ago | parent | next [-]

It’s not about hand crafted code or even code performance.

We know from experimentation that agents will change anything that isn’t nailed down. No natural language spec or test suite has ever come close to fully describing all observable behaviors of a non-trivial system.

This means that if no one is reviewing the code, agents adding features will change observable behaviors.

This gets exposed to users as churn, jank, and broken workflows.

raw_anon_1111 9 hours ago | parent [-]

That’s easy enough to prevent with modular code; that’s what “plan mode” is for. But you’ve probably never worked with a bunch of C# developers using R#.

sarchertech 9 hours ago | parent [-]

1. Preventing agents from crossing boundaries, creating implicit and explicit dependencies, and building false layers requires much more human control over every PR, and involvement with the code, than you seem to be advocating.

2. Assuming that techniques that work with human developers will also work with agents that have severely impaired judgment but are massively faster at producing code is a bad idea.

3. There’s no way you have enough experience with maintaining code written in this way to confidently hand wave away concerns.

raw_anon_1111 9 hours ago | parent [-]

Absolutely no one in the value chain cares about how many layers of abstraction your code has, not your management or your customers. They care about functional and non-functional requirements.

sarchertech 6 hours ago | parent [-]

Of course they don’t. Please reread what I said, give it the slightest bit of thought, and re-respond if you want a response from me.

raw_anon_1111 6 hours ago | parent [-]

By definition, coding agents are right now the worst they will ever be, and the industry as a whole is by definition the least experienced it will ever be at using them.

So many people on HN are insulted that the people who put money in our bank accounts, and in some cases stock in our brokerage accounts, never cared about their bespoke clean code or GOF patterns. They never did; LLMs just made it more apparent.

It’s always been dumb for PR review to focus on for loops vs. while loops instead of on whether functional and non-functional requirements are met.

sarchertech 3 hours ago | parent [-]

Wow, you have completely lost the plot. It’s like you’re a bot that’s mixing up who it’s replying to.

raw_anon_1111 3 hours ago | parent [-]

Just maybe you aren’t making the strong argument you think you are making

hard24 11 hours ago | parent | prev | next [-]

"No one cares about handcrafted artisanal code as long as it meets both functional and non functional requirements"

Speak for yourself. I don't hire people like you.

raw_anon_1111 11 hours ago | parent [-]

And guess what? You probably don’t pay as much as I make now either…

Even in late 2023 with the shit show of the current market, I had no issues having multiple offers within three weeks just by reaching out to my network and companies looking for people with my set of skills.

hard24 11 hours ago | parent [-]

I field a small team of experts who are paid upwards of a million GBP in cold, hard cash in London. Not stock. Cash.

You sound like a bozo, I can sniff it through my screen.

zepolen 6 hours ago | parent [-]

This sounds like a place I want to work at.

YCpedohaven 11 hours ago | parent | prev [-]

[flagged]

raw_anon_1111 11 hours ago | parent [-]

Yes because I didn’t check to see if Claude code used a for loop instead of a while loop? Or that it didn’t use my preferred GOF pattern and didn’t use what I read in “Clean Code”?

Guess what? I also stopped caring how registers are used and counting clock cycles in my assembly language code, like it’s the ’80s and I’m still programming on a 1 MHz 65C02.

icedchai 9 hours ago | parent [-]

I can see the argument both ways. Some code is just not worth looking at...

But do you look at any of the AI output? Or is it just "it works, ship it"?

raw_anon_1111 7 hours ago | parent [-]

My last project was basically an ETL implementation on AWS, starting with an empty AWS account, plus an internal web admin site that had 10 pages. I am yada yada yadaing over a little bit.

What I checked:

1. The bash shell scripts I had it write as my integration test suite

2. To make sure it wasn’t loading the files into Postgres the naive way (pulling the file down from S3 and doing bulk inserts) instead of using the AWS extension that lets Postgres load directly from S3. It’s the difference between taking 20 minutes and 20 seconds.

3. I had strict concurrency and failure recovery requirements. I made sure it was done the right way.

4. Various security, logging, log retention requirements

What I didn’t look at - a line of the code for the web admin site. I used AWS Cognito for authentication and checked to make sure that unauthorized users couldn’t use the website. Even that didn’t require looking at the code - I had automated tests that tested all of the endpoints.
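The “verify auth without reading the app code” approach above amounts to black-box endpoint testing. Here’s a minimal, self-contained sketch of that idea; the real project used Cognito-issued tokens against deployed endpoints, so the local server, the `/admin` path, and the hardcoded token here are all stand-ins:

```python
import http.server
import threading
import urllib.error
import urllib.request

# Stand-in for a Cognito-issued JWT; a real suite would fetch a
# token from the Cognito token endpoint instead.
TOKEN = "Bearer test-token"

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Reject any request that lacks the expected bearer token.
        status = 200 if self.headers.get("Authorization") == TOKEN else 401
        self.send_response(status)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

def check_endpoint(url, token=None):
    """Return the HTTP status of a GET, with or without credentials."""
    req = urllib.request.Request(url)
    if token:
        req.add_header("Authorization", token)
    try:
        return urllib.request.urlopen(req).status
    except urllib.error.HTTPError as err:
        return err.code

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# Black-box checks: verifying these requires zero knowledge of how
# the handler is implemented.
unauth_status = check_endpoint(f"{base}/admin")
auth_status = check_endpoint(f"{base}/admin", TOKEN)
server.shutdown()
print(unauth_status, auth_status)
```

The point of the pattern is that the assertions only touch observable behavior (status codes), which is exactly what lets you skip reviewing the generated handler code.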

icedchai 5 hours ago | parent [-]

This all makes sense.

I've witnessed human developers produce incredibly convoluted, slow "ETL pipelines" that took 10+ minutes to load single digit megabytes of data. It could've been reduced to a shell script that called psql \copy.
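The `psql \copy` point generalizes: convoluted slow loaders are usually paying per-row overhead. A self-contained sketch of the gap, using in-memory SQLite as a stand-in for Postgres (on RDS the bulk path would be `\copy` or the `aws_s3` extension’s `table_import_from_s3`; the table and row counts here are illustrative):

```python
import sqlite3
import time

ROWS = [(i, f"name-{i}") for i in range(20_000)]  # stand-in for a CSV extract

def load_naive(conn, rows):
    # One INSERT and one COMMIT per row: the shape of pipeline that
    # takes 10+ minutes for single-digit megabytes once every
    # statement pays a network round trip and a sync.
    for rec in rows:
        conn.execute("INSERT INTO t VALUES (?, ?)", rec)
        conn.commit()

def load_bulk(conn, rows):
    # One batched statement in one transaction: the moral
    # equivalent of COPY / psql \copy.
    conn.executemany("INSERT INTO t VALUES (?, ?)", rows)
    conn.commit()

def timed(loader):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
    start = time.perf_counter()
    loader(conn, ROWS)
    elapsed = time.perf_counter() - start
    count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    conn.close()
    return elapsed, count

naive_s, naive_n = timed(load_naive)
bulk_s, bulk_n = timed(load_bulk)
print(f"naive: {naive_s:.3f}s  bulk: {bulk_s:.3f}s  rows: {naive_n}")
```

Both paths load the same rows; only the batching differs, which is why the fix is often a one-line change rather than a rewrite.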