joseangel_sc 10 hours ago

Except the thing does not work as expected, and it just makes you worse, not better.

keyle 9 hours ago | parent | next [-]

Like I said, that's temporary. It's janky and wonky, but it's a stepping stone.

Just look at image generation. Actually, factually, look at it. We went from horrifying colour vomit with eyes all over, to 6-fingered humans, to pretty darn good now.

It's only time.

leecommamichael 9 hours ago | parent | next [-]

Why is image generation the same as code generation?

dcw303 9 hours ago | parent | next [-]

It's not. We were able to get rid of 6-fingered hands by getting very specific and fine-tuning models with lots of hand and finger training data.

But that approach doesn't work with code, or with reasoning in general, because you would need to exponentially fine-tune everything in the universe. The illusion that the AI "understands" what it is doing is lost.

rvz 7 hours ago | parent | prev [-]

It isn't.

Code generation progression in LLMs still carries a higher objective risk of failure, depending on the experience of the person using it, because:

1. They still cannot trust that the code works (even if it has tests), so it needs thorough human supervision and ongoing maintenance.

2. Because of (1), it can cost you more money than the tokens you spent building it in the first place when it goes horribly wrong in production.

Image generation progression comes with close to no operational impact, requires far less human supervision, and can safely be done with none at all.

mr_freeman 2 hours ago | parent | prev [-]

> Just look at image generation. Actually, factually, look at it. We went from horrifying colour vomit with eyes all over, to 6-fingered humans, to pretty darn good now.

Yes, but you’re not taking into account what actually caused this evolution. At first glance, it looks like exponential growth, but then we see OpenAI (as one example) with trillions in obligations compared to 12–13 billion in annual revenue. Meanwhile, tool prices keep rising, hardware demand is surging (RAM shortages, GPUs), and yet new and interesting models continue to appear. I’ve been experimenting with Claude over the past few days myself. Still, at some point, something is bound to backfire.

The AI "bubble" is real, you don’t need a masters degree in economics to recognize it. But with mounting economic pressures worldwide and escalating geopolitical tension we may end up stuck with nothing more than those amusing Will Smith eating pasta videos for a while.

beebmam 9 hours ago | parent | prev | next [-]

Comments like these are why I hardly ever browse HN anymore.

w4yai 8 hours ago | parent [-]

Nothing new. Whenever a new layer of abstraction is added, people say it's worse and will never be as good as the old way. It's a totally biased opinion, though; as human beings we just have trouble giving up things we like.

roadbuster 7 hours ago | parent | next [-]

> Whenever a new layer of abstraction is added

LLMs aren't a "layer of abstraction."

99% of people writing in assembly don't have to drop down into manual cobbling of machine code. People who write in C rarely drop into assembly. Java developers typically treat the JVM as "the computer." In the OSI network stack, developers writing at level 7 (application layer) almost never drop to level 5 (session layer), and virtually no one even bothers to understand the magic at layers 1 & 2. These all represent successful, effective abstractions for developers.

In contrast, unless you believe 99% of "software development" is about to be replaced with "vibe coding", it's off the mark to describe LLMs as a new layer of abstraction.

w4yai 5 hours ago | parent [-]

> unless you believe 99% of "software development" is about to be replaced with "vibe coding"

Probably not vibe coding, but most certainly with some AI automation.

duskdozer 6 hours ago | parent | prev | next [-]

The difference is that LLM output is very nondeterministic.

w4yai 5 hours ago | parent [-]

It depends. Temperature is a variable. If you really need determinism, you could build an LLM for that. Non-determinism can be a good feature, though.
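For what it's worth, here's a minimal sketch of the "turn the temperature off" part, assuming the Hugging Face transformers library and a small gpt2 model purely for illustration (the thread doesn't name either): with do_sample=False the model decodes greedily, skipping the sampling step where temperature applies, so repeated runs on the same machine return the same tokens.

    # Minimal sketch: greedy (temperature-free) decoding with transformers.
    # The library, model, and prompt here are illustrative assumptions.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    torch.manual_seed(0)  # fixing seeds helps, but is not a cross-hardware guarantee

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    inputs = tokenizer("The fastest way to sort a list is", return_tensors="pt")

    # do_sample=False means greedy decoding: no random sampling step, so the
    # same prompt yields the same tokens on the same machine and software stack.
    output = model.generate(**inputs, do_sample=False, max_new_tokens=30)
    print(tokenizer.decode(output[0], skip_special_tokens=True))

Bit-exact reproducibility across different GPUs, kernels, and batch sizes is a harder problem than just turning sampling off, which is roughly what "build an LLM for that" would mean in practice.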

duskdozer 2 hours ago | parent [-]

How would you do that? If it's possible, it seems strange that someone hasn't done it already.

CrimsonRain 7 hours ago | parent | prev [-]

That's your opinion, and you can choose not to use those tools.

People are paying for it because it helps them. Who are you to whine about it?

nunez 7 hours ago | parent [-]

But that's the entire flippin' problem. People are being forced to use these tools professionally at a staggering rate. It's like the industry is in its "training your replacement" era.

CrimsonRain 5 hours ago | parent | next [-]

You don't like it? Find a place that doesn't enforce it. Can't find one? Then either build it, or accept that you want a horse carriage while people want taxis.

mr_freeman 2 hours ago | parent | prev [-]

That's Capitalism, baby