andrewmutz 5 days ago

Reading smart software people talk about AI in 2025 is basically just reading variations on the lump of labor fallacy.

If you want to understand what AI can do, listen to computer scientists. If you want to understand its likely impact on society, listen to economists.

victorbjorklund 5 days ago | parent | next [-]

100%. Just because someone understands how a NN works does not mean they understand the impact it has on the economy, society, etc.

They could of course be right. But they don't have any more insight than any other smart person does.

DrewADesign 5 days ago | parent [-]

The “I think I understand a field because I think I understand the software for that field,” thing is a perennial problem in the tech world.

exasperaited 4 days ago | parent [-]

Indeed it is -- it's perhaps the central way developers offend their customers, let alone misunderstand them.

One problem is it is met from the other side by customers who think they understand software but don't actually have the training to visualise the consequences of design choices in real life.

Good software does require cross-domain knowledge that goes beyond "what existing apps in the market do".

I have in the last few years implemented a bit of software where a requirement had been set by a previous failed contractor and I had to say, look, I appreciate this requirement is written down and signed off, but my mother worked in your field for decades, I know what kind of workload she had, what made it exhausting, and I absolutely know that she would have been so freaking furious at the busywork this implementation will create: it should never have got this far.

So I had to step outside the specification, write the better functionality to prove my point, and I don't think realistically I was ever compensated for it, except metaphysically: fewer people out there are viscerally imagining inflicting harm on me as a psychological release.

mmmore 4 days ago | parent | prev | next [-]

Here's a thoughtful post related to your lump of labor point: https://www.lesswrong.com/posts/TkWCKzWjcbfGzdNK5/applying-t...

Which economists have taken seriously the premise that AI will be able to do any job a human can, more efficiently, and fully thought through its implications? i.e. a society where (human) labor is unnecessary to create goods/provide services and only capital and natural resources are required. The capabilities that some computer scientists think AI will soon have would imply that. The ones that have seriously considered it that I know of are Hanson and Cowen; it definitely feels understudied.

amanaplanacanal 4 days ago | parent [-]

If it is decades or centuries off, is it really understudied? LLMs are so far from "AI will be able to do any job a human can more efficiently and fully" that we aren't even in the same galaxy.

mmmore 4 days ago | parent | next [-]

If AI that can fully replace humans is 25 years off, preparing society for its impacts is still one of the most important things we can do to ensure that my children (whom I have not had yet) live a prosperous and fulfilling life. The only other things of possibly similar import are preventing WWIII and preventing a pandemic worse than COVID.

I don't see how AGI could be centuries off (at least without some major disruption to global society). If computers that can talk, write essays, solve math problems, and code are not a warning sign that we should be ready, then what is?

ori_b 4 days ago | parent | prev [-]

Decades isn't a long time.

ACCount37 4 days ago | parent | prev | next [-]

How does "lump of labor fallacy" fare when there is no job remaining that a human can do better or cheaper than a machine?

The list of advantages human labor holds over machines is both finite and rapidly diminishing.

marstall 4 days ago | parent [-]

> no job remaining that a human can do better or cheaper than a machine

This is the lump of labor fallacy. Jobs machines do produce commodities, and commodities don't have much value. Humans crave value; it's a core component of our psyche. Therefore new things will be desired, expensive things ... and only humans can create expensive things, since robots don't get salaries.

nibnalin 5 days ago | parent | prev | next [-]

Whose writing or podcasts would you recommend reading or listening to?

snapey 5 days ago | parent [-]

Tyler Cowen has a lot of interesting things to say on the impact of AI on the economy. His recent talk at DeepMind is a good place to start https://www.aipolicyperspectives.com/p/a-discussion-with-tyl...

silveraxe93 4 days ago | parent | prev [-]

The title - "AI is different" - and this line:

""" Yet the economic markets are reacting as if they were governed by stochastic parrots. Their pattern matching wants that previous technologies booms created more business opportunities, so investors are polarized to think the same will happen with AI. """

Are a direct argument against your point.

If people were completely unaware of the lump of labor fallacy, I'd understand your comment. It would be adding extra information to the conversation. But this is not that. The "lump of labor fallacy" is not a physical law. If someone is literally arguing that it doesn't apply in this case, you can't just parrot it back and leave. That's not a counterargument.