crazygringo 3 days ago

This is no different from the personal computer, and it is to be expected.

The initial years of adopting new tech have no net return because it's an investment. The money saved is offset by the cost of setting up the new tech.

But then once the processes all get integrated and the cost of buying and building all the tech gets paid off, it turns into profit.

Also, some companies adopt new tech better than others. Some do it badly and go out of business. Some do it well and become a new market leader. Some show a net return much earlier than others because they're smarter about it.

No "oof" at all. This is how investing in new transformative business processes works.

TimTheTinker 3 days ago | parent | next [-]

> transformative business processes

Many new ideas have come along promising to be "transformative" but never reached anywhere near the impact people initially expected. Some examples: SOA, low-code/no-code, blockchain for anything other than cryptocurrency, IoT, NoSQL, the Semantic Web. Each of these has had some impact, but they've all plateaued, and there are very good reasons (including the results cited in TFA) to think GenAI has also plateaued.

My bet: although GenAI has plateaued, new variants will appear that integrate or are inspired by "old AI" ideas[0] paired with modern genAI tech, and these will bring us significantly more intelligent AI systems.

[0] a few examples of "old AI": expert systems, genetic algorithms, constraint solving, theorem proving, S-expression manipulation.

MonkeyClub 2 days ago | parent [-]

> [...] S-expression manipulation.

Can't wait for Lisp to be the language of the future again.

Some of my friends reckon it'll happen the year after the year of Linux on the desktop. They're on Windows 11, though, so I don't know how to read that.

delusional 3 days ago | parent | prev | next [-]

The document actually debunks this take:

> GenAI has been embedded in support, content creation, and analytics use cases, but few industries show the deep structural shifts associated with past general-purpose technologies such as new market leaders, disrupted business models, or measurable changes in customer behavior.

They are not seeing the structural "disruptions" that were present for previous technological shifts.

signatoremo 2 days ago | parent [-]

Changes over which time window? Enterprise AI projects can't be more than about two years old, which is practically still the testing-the-waters phase, so of course very few projects of a disruptive nature exist yet.

PhantomHour 3 days ago | parent | prev | next [-]

> This is no different from the personal computer, and it is to be expected.

What are you talking about? The return on investment from computers was immediate and extremely identifiable. For crying out loud, "computers" are literally named after the people whose work they automated.

With Personal Computers the pitch was similarly immediate. It's trivial to point at what labour VisiCalc automated and improved. The gains are easy to measure, and for every individual feature you can explain what it's useful for.

You can see where this falls apart in the Dotcom Bubble. There were very clear pitches: "Catalogue store but over the internet instead of a phone" has immediately identifiable improvements (not needing to ship out catalogues, being able to update it quickly, not needing humans to answer the phones).

But the hype and the failed infrastructure buildout? Sure, Cisco could give you an answer if you asked them what all the internet buildout was good for. Just not a concrete one with specific revenue streams attached, and we all know how that ended.

The difference between Pets.com and Amazon is almost laughably poignant here. Both were ultimately attempts to make "catalogue store but on the computer" work, but Amazon focussed on broad inventory and UX. It had losses, but managed to contain them and became profitable quickly (Q4 2001). Amazon's losses shrank as revenue grew.

Pets.com's selling point was selling you stuff below cost. Good for growth, certainly, but it also means their losses grew with their growth. The pitch was clearly and inherently flawed. "How are you going to turn profitable?" "We'll shift into selling less expensive goods." "How are you going to do that?" "Uhhh....."
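A toy model makes the distinction concrete (every number here is made up for illustration; the only thing that matters is the sign of the per-order margin):

    # Toy unit economics; all figures are illustrative, not real financials.
    def annual_result(orders: int, unit_margin: float, fixed_costs: float) -> float:
        """Profit (positive) or loss (negative) for one year."""
        return orders * unit_margin - fixed_costs

    # Positive margin per order: growth closes the gap on fixed costs.
    for orders in (100_000, 500_000, 1_000_000):
        print("positive margin:", annual_result(orders, 2.0, 1_000_000))
    # -> -800000.0, 0.0, 1000000.0 : losses shrink, then flip to profit.

    # Negative margin per order (selling below cost): growth digs the hole deeper.
    for orders in (100_000, 500_000, 1_000_000):
        print("negative margin:", annual_result(orders, -2.0, 1_000_000))
    # -> -1200000.0, -2000000.0, -3000000.0 : losses scale with revenue.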

...

The observant will note: this is the exact same operating model as the large AI companies'. ChatGPT is sold below unit cost. Claude is sold below unit cost. Copilot is sold below unit cost.

What's the business pitch here? Even OpenAI struggles to explain what ChatGPT is actually useful for. Code assistants are the big concrete pitch, and even those crack at the edges as study after study shows the benefits appear to be psychosomatic. Even if Moore's law hangs on long enough to bring inference costs down (never mind per-task token usage skyrocketing, which makes even that appear moot), what's the pitch? Who's going to pay for this?

Who's going to pay for a Personal Computer? Your accountant.

bpt3 2 days ago | parent | next [-]

The contortions people will go through to defend a technology or concept they like blow my mind. Irrational exuberance is one thing, but denial of history in order to lower the bar for the next big thing really irritates me for some reason.

Computing was revolutionary, both at enterprise and personal scale (separately). I would say smartphones were revolutionary. The internet was revolutionary, though it did take a while to get going at scale.

Blockchain was not revolutionary.

I think LLM-based AI is trending towards blockchain, not general-purpose computing. In order for it to be revolutionary, it needs to objectively and quantifiably add value to the lives (professional or personal) of a significant portion of the population. I don't see how that happens with LLMs. They aren't reliable enough and don't seem to have any path towards reasoning or understanding.

crazygringo 2 days ago | parent | prev | next [-]

> What are you talking about? The return on investment from computers was immediate and extremely identifiable.

It's well documented, and it's called the "productivity paradox of computers" if you want to look it up. It was identified in 1987, and economic statistics show that personal computing didn't become a net positive for the economy until around 1995-1997.

And like I said, it's very dependent on the individual company. But consider how many businesses bought computers and didn't use them productively, where it was a net loss because the computers were expensive, the software was expensive, and the efficiency gained wasn't worth the cost -- or worse, the tech wasn't a good match and efficiency actually dropped. Think of how many expensive attempted migrations from paper processes to early databases failed completely.

PhantomHour 2 days ago | parent [-]

It's well documented. It's also quite controversial, and economists still dispute it to this day.

It's an economic analysis of the entire economy, from the "outside" (statistics) inward. My point is that the individual business case was financially solvent.

Apple Computer did not need to "change the world"; it needed to sell computers at a profit, sell enough of them to cover its fixed costs, and do so without relying on other people setting their money on fire. (And it succeeded on all three counts.) Whether they were a minute addition to the entire economy or a gigantic one is irrelevant.

Similarly with AI: AI does not need to "increase aggregate productivity over the entire economy"; it needs to turn a profit or it dies. Whether it can keep the boomer pension funds from going insolvent is a question for economics wonks. Ultimately the aggregate economic effects follow from the individual ones.

Thus the difference: PCs had a "core of financial solvency" nearly immediately. Even if they weren't useful for 99.9% of jobs, that 0.1% would still find them useful enough to buy and keep the industry alive. If the hype were to run out on such an industry, it shrinks to something sustainable. (Compare consumer goods like smartwatches, which were hyped for a while and didn't change the world, but maintained a core audience large enough to sustain the industry.)

With AI, even the AI companies struggle to pitch such a core, never mind actually prove it.

crazygringo 2 days ago | parent [-]

The productivity paradox isn't disputed by any mainstream economists. What is debated is its exact timing, its size, and exactly which parts of businesses were most responsible (i.e., was the eventual growth mostly about computers improving existing processes, or about computers enabling brand-new processes like just-in-time supply chains?). The underlying concept is generally considered sound and uncontroversial.

I don't really understand what point you're trying to make. It seems like you're complaining that CapEx costs are higher in GenAI than they were in personal computing? But lots of industries have high CapEx; that's what investors are for.

The only point I've made is that "95% of organizations are getting zero return" is to be expected in the early days of a new technology, and that the personal computer is a reasonable analogy here. The subject is companies that use the tech, not companies creating the tech; the investment model behind the core tech has nothing to do with the profitability of companies trying to use it or build on it. The point is that it takes a lot of time and trial and error to figure out how to use a new tech profitably, and we are currently in the very early days of GenAI.

simianwords 3 days ago | parent | prev [-]

I highly doubt that the return on investment was seen immediately for personal computers. Do you have any evidence? Can you show me a company that adopted personal computers and immediately increased its profits? I'll change my mind.

Jensson 2 days ago | parent | next [-]

I know people who bought a computer in the '70s, automated away a massive amount of work, and thus paid it off in a single day.

Back then companies needed a massive number of people to sit and do all the calculations for their accounting, but a single person using a computer could do the same work in a day. This was so easy and efficient that almost every bigger company started buying computers at the time.

You don't need to automate away the accountants; you just need to automate away the many thousands of calculations needed to complete the accounting, and that saves a massive amount of money. It wasn't hard to convince people to use a computer instead of sitting for weeks manually calculating sums on sheets.

PhantomHour 2 days ago | parent | prev [-]

I'm sorry, but you're asking me to dig up decades-old data to justify my claim that "spreadsheet software has an immediately identifiable ROI".

I am not going to do that. If you won't take me at my word that "a computer doing a worksheet's worth of calculations automatically" is faster and less error-prone than "a human [with an electronic calculator] doing that by hand", then that's a you problem.

An Apple II cost $1,300. VisiCalc cost $200. An accountant at the time would've cost ~10x that annually, and would either spend quite a bit more than 10% of their time doing the rote work or hire dedicated people to do it.
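
Even the conservative back-of-the-envelope version of that maths pays off within a year (the salary and the 10% rote-work share are the assumptions above, not sourced figures):

    # Back-of-the-envelope VisiCalc payback; every figure is an assumption.
    hardware = 1300              # Apple II
    software = 200               # VisiCalc
    setup = hardware + software  # $1,500 one-time cost

    accountant_salary = 15_000   # ~10x the setup cost annually, per above
    rote_share = 0.10            # assumed fraction of time spent on rote sums

    annual_savings = accountant_salary * rote_share        # $1,500/year
    print(f"payback: {setup / annual_savings:.1f} years")  # -> payback: 1.0 years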

simianwords 2 days ago | parent [-]

> If you won't take me at my word that "a computer doing a worksheet's worth of calculations automatically" is faster and less error-prone than "a human [with an electronic calculator] doing that by hand", then that's a you problem.

Reality is complicated and messy. There are many hurdles to overcome, many people to convince, and a lot of logistics to handle. You can't just replace accountants with computers - it takes time. You can understand why I find it hard to believe that a huge jump like the one with software wouldn't take time as well.

datavirtue 3 days ago | parent | prev [-]

I think it just turns into table stakes.