lisbbb 7 days ago

Everything I have worked on as a fullstack developer at multiple large companies over the past 25 years tells me that AI isn't simply going to replace a bunch of workers. The complexity of those places is crazy, and it takes teamwork to keep them running. Just look at what happens internally over a long holiday weekend at most big companies: they are often just barely meeting their uptime guarantees.

I was recently at a big three-letter pharmacy company. I can't be specific, but let me just say this: they're always on the edge of having their main websites go down for one reason or another. It's a constant battle.

How is adding more AI complexity going to help any of that when they don't even have a competent enough workforce to manage the complexity as it is today?

You mention VR--that's another huge flop. I got my son a VR headset for Christmas in like 2022. It was cool, but he couldn't use it for long without getting nauseous. I was like "okay, this is problematic." I really liked it in some ways, but sitting around with that goofy thing on your head wasn't a strong selling point at all. It just wasn't.

If AI can't start doing things with accuracy and cleverness, then it's not useful.

cheevly 7 days ago | parent | next [-]

You have it so backwards. The complexity of those places is exactly why AI will replace it.

cheema33 7 days ago | parent | prev | next [-]

> If AI can't start doing things with accuracy and cleverness, then it's not useful.

Humans are not always accurate or clever. But we still consider them useful and employ them.

827a 7 days ago | parent | prev [-]

So, to give a concrete example that helped me recently: We have a frontend web application that was having some issues with a specific feature. This feature fires a complex chain of maybe a dozen API requests when a resource is created, conditionally based on certain things, and a similar process happens when editing that resource. But there was a difference in behavior between the create and edit routes, where a user expected the behavior to be the same.

This is crusty, horrible, old, complex code. Nothing is in one place. The entire editing experience was copy-pasted from the create resource experience (not even reusable components; literally copy-pasted). As the principal on the team, and the person who understands this code best, even my understanding was basically just "yeah, I think these ten or so things should happen in both cases, because that's how the last guy explained it to me and it vibes with how I've seen it behave when I use it".
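To make the shape of that drift concrete, here's a minimal sketch. Every name in it (api, ResourceInput, the /notify endpoint) is made up for illustration; it's not the actual code, just the pattern:

    // Hypothetical sketch of copy-pasted create/edit flows drifting apart.
    interface ResourceInput {
      name: string;
      owners: string[];
      affectedUsers: string[];
      assignOwners: boolean;
    }

    // Thin stand-in for the app's real API client.
    const api = {
      post: (url: string, body?: unknown) =>
        fetch(url, { method: "POST", body: JSON.stringify(body) }).then(r => r.json()),
      put: (url: string, body?: unknown) =>
        fetch(url, { method: "PUT", body: JSON.stringify(body) }).then(r => r.json()),
    };

    async function createResource(input: ResourceInput): Promise<void> {
      const resource = await api.post("/resources", input);
      if (input.assignOwners) {
        await api.post(`/resources/${resource.id}/owners`, input.owners);
      }
      // ...roughly a dozen more conditional calls like the one above...
      await api.post(`/resources/${resource.id}/notify`, { users: input.affectedUsers });
    }

    // Copy-pasted from createResource, then left to drift:
    async function editResource(id: string, input: ResourceInput): Promise<void> {
      const resource = await api.put(`/resources/${id}`, input);
      if (input.assignOwners) {
        await api.post(`/resources/${resource.id}/owners`, input.owners);
      }
      // ...same chain, except the /notify call never made it over, even though
      // the UI copy (also copy-pasted) still promises affected users a notification.
    }

The real thing is far longer and scattered across files, which is why the gap went unnoticed.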

I asked Cursor (Opus Max) something along the lines of: Compare and contrast the differences in how the application behaves when creating this resource versus updating it. Focus on the API calls it's making. It responded in short order with a great summary, and, without really being specifically prompted to generate this insight, it ended the message by saying: It looks like editing this resource doesn't make the API call to send a notification to affected users, even though the text on the page suggests that it should, and it does when creating the resource.

I suspect I could have just said "fix it" and it could have handled it. But, as with anything, as you say: it's more complicated than that. Because while the finding implies we want the app to do this, it's a human's job (not the AI's) to read into what's actually happening here: the user was confused because they expected the app to do this, but do they actually want the app to do this? Or were they just confused because text on the page (which was probably just copy-pasted from the create resource flow) implied that it would?

So instead I say: Summarize this finding into a couple of sentences I can send to the affected customer to get his take on it. Well, that's bread and butter for even the AIs of three years ago, so off it goes. It turns out the current behavior is correct; we just need to update the language to manage expectations better. The AI could also do that, but it's faster for me to just click the hyperlink in Claude's output, which jumps right to the file, and make the update myself.

Opus Max is expensive. According to Cursor's dashboard, this back-and-forth cost ~$1.50. But say it would have taken me just an hour to arrive at the same insight it found in a fifth of the time: an hour of my time is easily over $100. That's a net win for the business, and it's a net win for me, because I now understand the code better than I did before and I was able to spend my time on the parts of the problem that humans are good at.