falcor84 a day ago

But this is how disruptive innovation works. I recall that even around 2005, after digital camera sales overtook the sales of film cameras, people were still asking "If digital is so good, why aren't the professional photographers using them?" and concluding that digital photography was just a toy that would never really replace film.

vouwfietsman a day ago | parent | next [-]

This is not really the same level of argument. The post is arguing against the idea that software is incredibly cheap to make through AI right now, not that AI cannot ever make complete software products from scratch in the future.

falcor84 a day ago | parent [-]

Ok, well, rereading TFA, it does seem to say that, but I took it as hyperbole. I'm not aware of even the staunchest GenAI visionaries claiming:

> Why buy a CRM solution or a ERM system when “AI” can generate one for you in hours or even minutes?

Obviously that's a strawman argument that shouldn't be taken at face-value. AI-generated software is rapidly improving, but it will take some time until it can do that sort of work without human intervention. Extrapolating from METR's chart[0], we should expect a SotA AI to one-shot a modern commercial CRM in around the early 2030s. It's then up to anyone here to decide if that's something we should actively prepare for already (I personally think we should).
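To make that extrapolation concrete, here's a hedged back-of-envelope sketch (my own assumptions, not METR's projection): a SotA 50%-success task horizon of roughly one hour in early 2025, doubling every ~7 months per the trend METR reports, and "one-shot a commercial CRM" pegged at something like a work-year (~2000 hours) of human effort:

```python
import math

# All three figures are assumptions for illustration, not METR data.
horizon_hours = 1.0      # assumed SotA task horizon, early 2025
doubling_months = 7      # METR's reported doubling trend, roughly
target_hours = 2000.0    # assumed effort to one-shot a commercial CRM

doublings = math.log2(target_hours / horizon_hours)
years_out = doublings * doubling_months / 12
print(f"~{doublings:.0f} doublings, landing around {2025 + years_out:.0f}")
```

Under those assumptions the trend line lands in the early 2030s, which is where the comment's estimate comes from; change any input and the date moves accordingly.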

[0] https://metr.org/blog/2025-03-19-measuring-ai-ability-to-com...

callc a day ago | parent | prev | next [-]

But what?

Give some concrete examples of why current LLM/AI is disruptive technology like digital cameras.

That’s the whole point of the article. Show the obvious gains.

JW_00000 a day ago | parent [-]

falcor's point is that we will see this in 5 to 10 years.

falcor84 a day ago | parent | next [-]

Exactly. I'm arguing that what we should be focused on at this relatively early stage is not the amount of output but the rate of innovation.

It's important to note that we're now arguing about the level of quality of something that was a "ha, ha, interesting" in a sidenote by Andrej Karpathy 10 years ago [0], and then became a "ha, ha, useful for weekend projects" in his tweet from a year ago [1]. I'm looking forward to reading what he'll be saying in the next few years.

[0] https://karpathy.github.io/2015/05/21/rnn-effectiveness/

[1] https://x.com/karpathy/status/1886192184808149383?s=20

callc a day ago | parent | prev | next [-]

Why so long?

If AI had such obvious gains, why not accelerate that timeline to 6 months?

Take the average time to make a simple app, divide by the supposed productivity speed up, and this should be the time we see a wave of AI coded apps.
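As a concrete version of that arithmetic (both numbers here are purely illustrative assumptions, not data from the article):

```python
# Illustrative numbers only: neither figure comes from the article.
avg_app_months = 6       # assumed pre-AI time to build a simple app
claimed_speedup = 10     # the kind of multiplier boosters advertise

months_with_ai = avg_app_months / claimed_speedup
print(f"{months_with_ai:.1f} months per app")  # 0.6 months per app
```

If a 10x speedup were real, a wave of AI-built apps should have appeared within months rather than years, which is the commenter's point.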

As time goes on, the only conclusion we can reach (especially looking at the data) is that the productivity gains are not substantial.

amelius a day ago | parent [-]

> Why so long?

Because in the beginning of a new technology, the advantages of the technology benefit only the direct users of the technology (the programmers in this case).

However, after a while, the corporations see the benefit and will force their employees into an efficiency battle, until the benefit has shifted mostly away from the employees and towards their bosses.

After this efficiency battle, the benefits will become observable from a macro perspective.

spit2wind a day ago | parent | prev | next [-]

GPT-3 was released in May 2020. It's been nearly 5 years.

lukeschlather a day ago | parent | next [-]

The first digital camera was built around 1975. Digital cameras overtook film camera sales in 2005, 30 years later.

falcor84 a day ago | parent | prev [-]

Why is GPT-3 relevant? I can't recall anyone using GPT-3 directly to generate code. The closest would probably be Tabnine's autocompletion, which I think first used GPT-2, but I can't recall any robust generation of full functions (let alone programs) before late 2022 with the original GitHub Copilot.

amelius a day ago | parent | prev [-]

This gives me hope that we will finally see some competition to the Android/iOS duopoly.

KurSix 12 hours ago | parent | prev | next [-]

The digital camera analogy is flawed. Digital sensors had a clear and measurable path to improvement: megapixels, ISO, dynamic range. LLMs have no such clear path to "understanding" and "reliability". It's entirely possible we've hit a fundamental ceiling on their capabilities, not that we're just in an early stage.

rsynnott a day ago | parent | prev | next [-]

That feels a bit different. By 2005 it was obvious that digital cameras would, at some point in the future, be good enough to replace most high-end film camera use, unless Moore's Law went out the window entirely. So it was highly likely that digital cameras would take over. There is no inevitability to LLM coding tools hitting that 'good enough' state.

And they’re not even really talking about the future. People are making extremely expansive claims about how amazing llm coding tools are _right now_. If these claims were correct, one would expect to see it in the market.

exasperaited a day ago | parent | prev | next [-]

It is an aside, but: I am not sure I encountered any professional photographers saying that in 2005, FWIW; only non-serious photographers were still prattling on about e.g. the mystical and conveniently malleable "theoretical" resolution of film being something that would prevent them ever switching.

There were still valid practical and technical objections for many (indeed, there still is at least one technical objection against digital), and the philosophical objections are still as valid as they were (and if you ask me, digital has not come close to delivering on its promise to be less environmentally harmful).

But every working press photographer knew they would switch once full-frame sensors came within budget-planning range and could shoot without quality compromise at the ISO speeds they needed, or once the organisations they worked for completed their own digital transition. Every working fashion photographer knew that viable cameras already existed.

ETA: Did it disrupt the wider industry? Obviously. Devastatingly. For photographers? It lowered the barrier to entry and the amount they could charge. But any working photographer had encountered that at least once (autofocus SLRs did the same thing, minilabs did the same thing, E6 did it, etc. etc.) and in many ways it was a simple enabling technology because their workflows were also shifting towards digital so it was just the arrival of a DDD workflow at some level.

Putting aside that aside, I am really not convinced your comparison isn't a category error, but it is definitely an interesting one for a couple of reasons I need to think about for a lot longer.

Not least that digital photography triggered a wave of early retirements and career switches, that I think the same thing is coming in the IT industry, and that I think those retirements will be much more damaging. AI has so radically toxified the industry that it is beginning to drive people with experience and a decade or more of working life away. I consider my own tech retirement to have already happened (I am a freelancer and I am still working, but I have psychologically retired, and very early; I plan to live out my working life somewhere else, and to help people resisting AI to continue to resist it).

newsoftheday a day ago | parent [-]

> it is beginning to drive people with experience and a decade or more of working life away.

I was planning to work full-time until my mid-60s, but retired this year because of, as you put it, AI toxification.

ori_b a day ago | parent | prev | next [-]

It's asking "If digital is so good, why aren't there more photos?"

falcor84 a day ago | parent [-]

Exactly, let's take this analogy.

TFA is only looking at releases on app stores (rather than, e.g., the number of GitHub repos, which has been growing a lot). The analog would be the number of photos being published around 2005, which I believe had been pretty steady. It's only with the release of smartphones and Facebook a few years afterwards that we started seeing a massive uptick in the overall number of photos out there.

binary132 a day ago | parent | prev [-]

It’s actually comical watching the AI shills trot out the same points in every argument about the utility of LLMs. Now you’re supposed to say that after 10 years of digital, the only people sticking with film were the “curmudgeons”.

I for one hail the curmudgeons. Uphold curmudgeon thought.