crazygringo 2 days ago

> What are you talking about? The return on investment from computers was immediate and extremely identifiable.

It is well-documented, and called the "productivity paradox of computers" if you want to look it up. It was identified in 1987, and economic statistics show that personal computing didn't become a net positive for the economy until around 1995-1997.

And like I said, it's very dependent on the individual company. But consider how many businesses bought computers and didn't use them productively. For many it was a net loss, because the computers and software were expensive and the efficiency gained wasn't worth the cost -- or worse, the tools weren't a good match and efficiency actually dropped. Think of how many expensive attempted migrations from paper processes to early databases failed completely.

PhantomHour 2 days ago | parent

It's well documented. It's also quite controversial and economists still dispute it to this day.

It's economic analysis of the entire economy, from the "outside" (statistics) inward. My point is that the individual business case was financially solvent.

Apple Computer did not need to "change the world"; it needed to sell computers at a profit, sell enough of them to cover its fixed costs, and do so without relying on other people just setting their money on fire. (And it succeeded on all three counts.) Whether it was a minute addition to the entire economy or a gigantic one is irrelevant.

Similarly with AI. AI does not need to "increase aggregate productivity over the entire economy", it needs to turn a profit or it dies. Whether or not it can keep the boomer pension funds from going insolvent is a question for economics wonks. Ultimately the aggregate economic effects follow from the individual one.

Thus the difference. PCs had a "core of financial solvency" nearly immediately. Even if they weren't useful for 99.9% of jobs, that 0.1% would still find them useful enough to buy and keep the industry alive. If the hype ran out on such an industry, it would shrink to something sustainable. (Compare consumer goods like smartwatches, which were hyped for a while and didn't change the world, but maintained a suitable core audience to sustain the industry.)

With AI, even AI companies struggle to pitch such a core, never mind actually prove it.

crazygringo 2 days ago | parent

The productivity paradox isn't disputed by any mainstream economists. What is debated is its exact timing, its size, and exactly which parts of businesses were most responsible (i.e., was the eventual growth mostly about computers improving existing processes, or about computers enabling brand-new processes like just-in-time supply chains?). The underlying concept is generally considered sound and uncontroversial.

I don't really understand what point you're trying to make. It seems like you're complaining that CapEx costs are higher in GenAI than they were in personal computing? But lots of industries have high CapEx; that's what investors are for.

The only point I've made is that "95% of organizations are getting zero return" is to be expected in the early days of a new technology, and that the personal computer is a reasonable analogy here. The subject is companies that use the tech, not companies creating the tech. The investment model behind the core tech has nothing to do with the profitability of companies trying to use it or build on it. The point is that it takes a lot of time and trial and error to figure out how to use a new technology profitably, and we are currently in the very early days of GenAI.