bluefirebrand 5 hours ago

> if you asked for the right thing while being aware of context limitations

So, still pretty likely to produce slop in a large majority of cases

If the most useful place for them is where you've already specced things out to that degree of precision, then they aren't that useful?

Speccing things to that precision is the time-consuming and difficult work anyway, after all.

georgemcbay 5 hours ago | parent [-]

I think LLMs currently need to be used by someone who knows what they are doing to produce value, but the jump they made from being endless slop machines to useful tools in the right hands is enough for me to assume it is only a matter of time until they will be useful tools in the hands of even the untrained masses.

I wish this wasn't true because I think it will economically upend the industry in which I have a career, but sadly the universe doesn't care what I wish.

mjr00 5 hours ago | parent | next [-]

> assume it is only a matter of time until they will be useful tools in the hands of even the untrained masses.

IMO this vastly overestimates how good the "untrained masses" are at thinking in a logical, mathematical way. Apparently something as basic as Calculus II has a fail rate of ~50% in most universities.

chromacity 3 hours ago | parent | next [-]

How does this follow?

There's nothing "basic" about Calculus II. Calculus is uniquely cursed in mathematical education because everything that comes before it is more or less rooted in intuition about the real world, while calculus is built on axioms that are far more abstract and not substantiated well (not until later in your mathematical education). I expect many intelligent, resourceful people to fail it and I think it says more about the abstractions we're teaching than anything else.

But also, prompting LLMs to give good results is nowhere near as complex as calculus.

isueej 4 hours ago | parent | prev | next [-]

That’s why you can’t generalise opinions on here.

Most people on here don't belong to that group of people. So of course they can find a way to create value out of a thing that requires some tinkering and playing with.

The question is whether the techniques can evolve into technologies that produce stuff with minimal effort, while knowing only the bare minimum. I'm not convinced personally - it's a pipe dream that overlooks the innate skill necessary to produce stuff.

xyzelement 4 hours ago | parent | prev [-]

Who cares? People know what they want and need and AI is increasingly able to take it from there.

embedding-shape 4 hours ago | parent | next [-]

> People know what they want and need

If they truly did, there wouldn't be a huge amount of humans whose role is basically "Take what users/executives say they want, and figure out what they REALLY want, then write that down for others".

Maybe I've worked for too many startups, and only consulted for larger companies, but everywhere in businesses I see so many problems that are basically "Others misunderstood what that person meant" and/or "Someone thought they wanted X, they actually wanted Y".

mjr00 4 hours ago | parent | prev | next [-]

> People know what they want and need

The multi-decade existence of roles like "business analysts" and "product owners" (and sometimes "customer success") is pretty strong evidence that this is not the case.

PhilipRoman 4 hours ago | parent | prev | next [-]

What they want? Sometimes. What they need? Almost never.

isueej 4 hours ago | parent | prev [-]

Right… people knew they wanted an iPhone before it was conceived, right? Lmao

The arrogance of people like you is astonishing.

bluefirebrand 4 hours ago | parent | prev [-]

> I wish this wasn't true because I think it will economically upend the industry in which I have a career, but sadly the universe doesn't care what I wish.

I mean, yes. I'm worried about my career too, but for different reasons. I don't think these things are actually good enough to replace me, but I do think it doesn't matter to the people signing the cheques.

I don't believe LLMs are producing anything better than slop. I think people's standards have been sinking for a long time and will continue to sink until they reach the level LLMs produce.

The problem isn't just that LLMs produce slop; it's that people are, overall, pretty fine with slop.

I'm not, though, so there's no place for me in most software businesses anymore.

isueej 3 hours ago | parent [-]

I’m not a SWE.

But I look at software from the perspective of it being an object.

Since it's intangible, people can't see within it. So something can look pretty even if, underneath it all, it's slop.

However, there is an implicit trade-off: mounting slop makes you more vulnerable to security issues, bugs, etc., which can destroy trust and the experience of using the software. This can essentially put the life of a business at risk.

People aren't thinking much about that risk, because it hasn't substantially happened to anyone large yet. What I wonder is: will slop just continue to mount unchecked? Or do people expect improvements that will let them go back and clean up the slop with more powerful tooling?

If the latter does not come about, I think we will see more firms come under stress.

Overall though, I think too much focus is on the acceleration of output. I never think that's the most important thing. It's secondary to having a crystal-clear vision. The problem is that having a clear vision requires doing a lot of grunt work - it trains and conditions your mind to think in a particular kind of way.

It will be interesting to see how this all plays out.