boplicity 7 hours ago

> Let's see something ground-breaking

Why? People don't ask hammers to do much more than bash nails into walls.

AI coding tools can be incredibly powerful -- but shouldn't that power be focused on what the tool is actually good at?

There are many, many times that AI coding tools can and should be used to create a "small program that already exists in multiple forms in the training data."

I do things like this very regularly for my small business. It's allowed me to do things that I simply would not have been able to do previously.

People keep asking AI coding tools to be something other than what they currently are. Sure, that would be cool. But they absolutely have increased my productivity 10x for exactly the type of work they're good at assisting with.

Teknomadix 6 hours ago | parent | next [-]

>People don't ask hammers to do much more than bash nails into walls.

“It resembles a normal hammer but is outfitted with a little motor and a flexible head part which moves back and forth in a hammering motion, sparing the user from moving his or her own hand to hammer something by their own force and so making their job easier”

https://gremlins.fandom.com/wiki/Electric_Hammer

pigpop 6 hours ago | parent [-]

Good reference and a funny scene, but it doesn't quite hit home, because we have invented improved hammers in the form of pneumatic nail guns and even cordless nailers (some pneumatic, some motorized) which could truly be called an "electric hammer".

With this context the example may actually support the quote: nail guns do make driving nails much faster and easier, but that's all they do. You can't pull a nail with a nail gun, and you can't use it for any of the other things a regular hammer can do. They do 10x your ability to drive nails, though.

On the other hand, LLMs are significantly more multi-purpose than a nail gun.

ncallaway 7 hours ago | parent | prev | next [-]

> People keep asking AI coding tools to be something other than what they currently are.

I think it's for a very reasonable reason: the AI coding tool salespeople are often selling the tools as something other than what they currently are.

I think you're right that if you calibrate your expectations to what the tools are capable of, there's definitely value to be had. It would be nice if the marketing around AI also did the same thing.

onion2k 6 hours ago | parent | next [-]

The AI sales pitch seems to be very much aligned with productivity improvement -- "do more of the same but faster" or "do the same with fewer people". No one is selling "do more".

BeetleB 5 hours ago | parent | prev [-]

> I think it's for a very reasonable reason: the AI coding tool salespeople are often selling the tools as something other than what they currently are.

And if this submission were an AI salesperson trying to sell something, the comment/concern would be pertinent. It is otherwise irrelevant here.

BobbyJo 7 hours ago | parent | prev | next [-]

Yes! I can't tell you the number of times I thought to myself "If only there was a way for this problem to be solved once instead of being solved over and over again". If that is the only thing AI is good at, then it's still a big step up for software IMO.

fiyec30375 6 hours ago | parent [-]

It's true. Why should everyone look up the same API docs and make the same mistakes when AI can write it for you instantly and correctly?

blauditore 7 hours ago | parent | prev | next [-]

Because that's the vision of many companies trying to sell AI. Saying that what it can do now is already good enough might be true, but it's also moving the goalposts compared to what was promised (or feared, depending on who you're asking).

simonw 6 hours ago | parent | next [-]

One of the many important skills needed to navigate our weird new LLM landscape is ignoring what the salespeople say and listening to the non-incentivized practitioners instead.

darkerside 7 hours ago | parent | prev [-]

Can we get specific? What company and salesperson made what claim?

Let's not disregard interesting achievements because they are not something else.

Arisaka1 2 hours ago | parent | prev | next [-]

>Why?

Because I keep wondering: if AI is here and our output is charged up, why do I keep seeing more of the same products, just with an "AI" sticker slapped on top of them? From a group of technologists like HN and the startup world, who live on the edge of evolution and revolution, maybe my expectations were a bit too high.

All I see is the equivalent of "look how fast my new car got me to the supermarket, when I'm not too picky about which supermarket I end up at, and all I want is milk and eggs". Which is 100% fine, but at the end of the day I eat the same omelette as always. In this metaphor, I don't feel the slightest bit behind, or any sense of FOMO, if I cook my omelette slowly. I guess I have more time for my kids if I see the culinary arts as just a job. And it's not like restaurants suddenly get all their tables booked faster just because everyone cooks omelettes faster.

>It's allowed me to do things that I simply would not have been able to do previously.

You're not the one doing them. Me barking orders at John Carmack himself doesn't make me a Quake co-creator, and even if I micromanage his output like the world's most toxic micromanager who knows better, I'm still not Carmack.

On top of that, you would have been able to do those things previously, if you had cared enough to upskill to the point where token feeding isn't needed for you to feel productive. Tons of programmers have broken barriers and solved problems that hadn't been solved by anyone in their companies before.

I don't see why "I previously couldn't do this" counts as a bragging point. The LLMs you're using were trained on the same Google results you could've gotten if you had searched yourself.

boplicity 7 hours ago | parent | prev | next [-]

To be clear, I see a lot of "magical thinking" among people who promote AI. They imagine a "perfect" AI tool that can basically do everything better than a human can.

Maybe this is possible. Maybe not.

However, it's a fantasy. Granted, it is a compelling fantasy. But it's not one based on reality.

A good example:

"AI will probably be smarter than any single human next year. By 2029, AI is probably smarter than all humans combined.” -- Elon Musk

This is, of course, ridiculous. But, why should we let reality get in the way of a good fantasy?

falcor84 6 hours ago | parent [-]

> AI will probably be smarter than any single human next year.

Arguably that's already so. There's no clear single dimension for "smart"; even within exact sciences, I wouldn't know how to judge e.g. "Who was smarter, Einstein or Von Neumann?". But for any particular "smarts competition", especially if it's time limited, I'd expect Claude 4.5 Opus and Gemini 3 Pro to get higher scores than any single human.

darkwater 42 minutes ago | parent [-]

So we are back to the original comment that generated this thread: why hasn't AI generated a new and better compression algorithm, for example?

spzb 7 hours ago | parent | prev [-]

> Why? People don't ask hammers to do much more than bash nails into walls.

No one is propping up a multi-billion dollar tech bubble by promising hammers that do more than bash nails. As a point of comparison that makes no sense.

pigpop 6 hours ago | parent | next [-]

The software development market is measured in tens of billions to hundreds of billions of dollars, depending on which parts you're looking at, so inventing a better hammer (a better development tool) can be expected to drive billions of dollars of value. How many billions depends on how good a tool it turns out to be in the end. And that's only counting software; the same applies to all media (image, video, audio, text) and to some scientific domains (genetics, medicine, materials, etc.).

falcor84 6 hours ago | parent | prev [-]

That's nitpicking; in this manner you can dismiss any analogy, by finding an aspect on which it's different from the original comparandum.