bgwalter 6 days ago

The paperclip maximizer story, where an AGI directs all economic resources towards manufacturing paperclips, was wrong. Instead we have the graphics card maximizers, where humans voluntarily redirect vast economic resources towards generating a maximal stream of tokens that hardly anyone wants. Or perhaps call them token maximizers.

There used to be the notion that "talk is cheap". Now we are spending $trillions on generating idle talk.

gizajob 6 days ago | parent | next [-]

I always found this a bit of an omission in Bostrom’s argument in Superintelligence: on one hand he’d say we should avoid giving an AI a goal like paperclip maximisation, while overlooking that the transistor maximisation used to attempt to build the superintelligence in the first place is the same kind of goal. The same goes for whatever system he proposed to run his fanciful “mind uploading” Cartesian scenario.

dwood_dev 6 days ago | parent [-]

The difference is that the demand for paperclips is already met. Each additional paperclip is almost a pure waste of resources.

The demand for silicon, even outside AI, is not, and advances in silicon are going to benefit pretty much every industry in the long run, even if in the short term they are distorted by AI demand.

gizajob 5 days ago | parent [-]

Well, we have 8 billion non-artificial general intelligences walking around on earth, yet we’re dedicating enormous resources to trying to electronically replace them, all in the name of tech companies stealing value rather than spreading it out more evenly.

ksec 2 days ago | parent | prev | next [-]

> that hardly anyone wants.

People continue to mis-weight the benefits of AGI / LLMs, or in this case "graphics card maximizers". It is precisely because of this that we can continue to extend semiconductor improvements for at least another 5 years, bringing 1200W server CPUs, liquid cooling, and new DC and rack designs 5 years earlier than otherwise. It also partly continues to fund gaming GPUs, which the industry itself was not able to sustain, drives new improvements to networking gear due to heavy LLM use, and generates the HBM profits that fund current EUV DRAM innovations.

As far as I can tell, there is very little downside to all these investments; otherwise Big Tech would have spent that money on stock buybacks or dividends.

And this is assuming AGI really is a bubble and brings little to no productivity gain, and it excludes all the improvements adjacent to it.

creddit 6 days ago | parent | prev | next [-]

> ...that hardly anyone wants.

Meanwhile, in reality, ChatGPT is the fastest-growing consumer product ever and LLM provider revenues are growing superexponentially.

no_wizard 6 days ago | parent [-]

Threads broke the record within 5 days of launching, surpassing ChatGPT’s adoption pace; it has less pressure from competitors, and it very likely makes money for Meta.

On the other hand, OpenAI’s ChatGPT faces more intense pressure from competitors, isn’t making any money, and its operating costs are still rising.

I don’t know where you’re getting the idea that it’s the “fastest growing consumer product ever” and that “revenues are like super exponential”, as it’s been demonstrated repeatedly that OpenAI has yet to turn a profit or even meaningfully dent its burn rate.

creddit 5 days ago | parent [-]

Threads did grow very fast at launch, as Meta leveraged IG to supercharge growth, but it has fewer DAU than ChatGPT does today, so it seems a bit odd to suggest Threads is growing faster than ChatGPT, or ever has.

In fairness to your point, "fastest growing consumer product ever" isn't well defined per se. If a consumer product gets 1 sign up and then 1e-32 seconds later it gets a second, maybe THAT'S the "fastest growing ever".

> On the other hand, OpenAI’s ChatGPT has more intense pressure from competitors, isn’t making any money, and costs are still rising for its operation.

This isn't the first time someone has suggested that the social media space is low-competition, but I think that's completely incorrect. Threads competes with IG, FB, WA, Snap, TikTok, Bluesky, X, and many more. All are well funded and most have comparable or greater DAU.

In terms of making money, Threads only recently began to show ads at all. During this time, Threads has also been cannibalizing IG engagement. It's quite UNLIKELY that Threads is making very much money, and even less likely that on the whole it has been a positive revenue tailwind for Meta, even with the actual revenue they are starting to book. Meanwhile ChatGPT, which apparently isn't making any money, has a revenue run rate of ~$12B.

> I don’t know where you’re getting the idea that it’s the “fastest growing consumer product ever” and “revenues are like super exponential” as it’s demonstrated repeatedly that OpenAI has yet to turn a profit or even meaningfully dent their burn rate

Revenue != profit. Their revenue is growing unbelievably fast. Their user base has grown and is still growing extremely fast. The current revenue run rate of ~$12B is >3X their 2024 total revenue. That is enormously fast, especially from an already sizeable ~$4B base!

That their losses are so large is in large part a choice, as they provide a huge amount of inference for free today. The argument you're making about losses is the same tired one people made about FB, Uber, Amazon, and plenty of other high-growth companies, all of which are highly profitable today.

Anyone sitting around arguing that LLMs are a low-demand product that "hardly anyone wants" is one or more of: an idiot, willfully ignorant, highly misinformed by a trusted source, or actively spreading bollocks.

bee_rider 6 days ago | parent | prev | next [-]

The first artificial intelligence was the market. Just like the rest, it produces some wonderful tools, but putting it in charge of everything is a sure path to some kind of paperclip optimizer.

antisthenes 6 days ago | parent | prev [-]

> Now we are spending $trillions on generating idle talk.

We were spending $trillions on generating idle talk before AI as well. It was just done by meatbags.