As AI gobbles up chips, prices for devices may rise (npr.org)
299 points by geox a day ago | 442 comments
hansmayer 7 hours ago | parent | next [-]

I cannot stop thinking about how LLMs have a kind of Midas touch in reverse: everything they touch seems to get ruined or makes people want to avoid it, for example:

- Ghibli studio style graphics,

- the infamous em-dashes and bullet points

- customer service (just try to use Klarna's "support" these days...)

- Oracle's share price ;) - imagine being one of the world's most solid and unassailable tech companies, losing out to your CEO's crazy commitment to LLMs...

- Internet content - we now triple-check every Internet source we don't know to the core ...

- And now also the chips?

Where does it stop? When we decide to drop all technology as it is?

bogzz 7 hours ago | parent | next [-]

I am not sure how, or even if, it does stop. I assume once the hot air from LLM company CEOs starts being treated as the flatulence that it is, things will wind down. The sentiment against generated content is not going away.

api 4 hours ago | parent [-]

In previous eras there were many purists who considered photography not-art, sequencer- and synthesizer-made music not-music, other forms of (non-AI) digital art less legitimate than their more manual classical counterparts, etc. This is the same discourse all over again.

Is electronic music where the artist composes it on a screen and then hits 'play' music? I think it is, of course, but I have had experiences where I went to see a musician "live" and well... they brought the laptop with them. But I think it still counts. It was still fun.

AI slop is to AI art what point and shoot amateur photography is to artistic photography. The difference is how much artistic intent and actual work is present. AI art has yet to get people like Ansel Adams, but it will -- actual artists who use AI as a tool to make novel forms and styles of art.

(I used an emdash!)

This is an outstanding read: https://medium.com/@aaronhertzmann/how-photography-became-an...

Anti-photography discourse sounds exactly like anti-AI discourse to the point that you could search and replace terms and have the same rants.

Another thing I expect to see is novelists using AI to create at least passable live action versions of their stories. I don't think these will put real actors or actresses out of work for a long time, but I could see them serving as "sizzle reels" to sell a real production. If an author posts their AI-generated film of their novel and it gets popular, I could see a studio picking it up and making a real movie or TV show from it.

RiverCrochet 4 hours ago | parent | next [-]

> Is electronic music where the artist composes it on a screen and then hits 'play' music?

If X composes something, X is an artist. The person playing a composed work is a performer. Some people have both the roles of artist and performer for a given work.

To say an AI composes something is anthropomorphizing a computer. If you enter a prompt to make a machine generate work based on existing artists' art, you're not composing (in the artistic sense) and neither is the computer. Math isn't art even if it's pretty or if mathematical concepts are used in art.

The term "director", rather than composer or artist, conveys a lot better what's happening when you tell machines to generate art via prompts.

chowells an hour ago | parent | next [-]

I mostly agree with your sentiment, but saying "math is not art" is the same as saying "writing is not art". Calculation isn't art. But math isn't calculation. Math is a social activity shared between humans. Like writing, much of it is purely utilitarian. But there's always an aesthetic component, and some works explore that without regard to utility. It's a funny kind of art, accessible to few and beautiful to even fewer. But there is an art there.

RiverCrochet an hour ago | parent | next [-]

This really made me think and you're right. Perhaps I should have said "calculation" instead of "math."

worik an hour ago | parent | prev [-]

When it comes to art, description is after practice

It does not matter if they are labeled "composer" or "director". It is the product that counts.

"....I know what I like"

mattmanser 2 hours ago | parent | prev [-]

The vast majority of artists in all fields don't really have their own style and are just copying other people's. Doesn't matter whether we're talking about art, literature, music, film, whatever.

It takes a rare genius to make a new style, and they come along a few times a generation. And even they will often admit they built on top of existing styles and other artists.

I'm not a fan of AI work or anything, but we need to be honest about what human 'creativity' usually is, which for most artists is basically copying the trends of the time with at most a minor twist.

OTOH, I think when you start entering the fringes of AI work you really start seeing how much it's just stealing other people's work. With more niche subjects, it will often produce copies of the few artists in that field with a few minor, often bad, changes.

matthewkayin an hour ago | parent [-]

Sure, you can say that AI is just "stealing like an artist", but that makes the AI the artist in this scenario, not the prompter.

It bothers me that all of the AI "artists" insist that they are just the same as any other artist, even though it was the AI that did all of the work. Even when a human artist is just copying the styles they've seen from other artists, they still had to put in the effort to develop their craft to make the art in the first place.

smoe an hour ago | parent | prev | next [-]

I'm not against AI art per se, but at least so far, most “AI artists” I see online seem to care very little about the artistry of what they’re doing, and much much more about selling their stuff.

Among the traditional artists I follow, maybe 1 out of 10 posts is directly about selling something. With AI artists, it’s more like 9 out of 10.

It might take a while for all the grifters to realize that making a living from creative work is very hard before more genuinely interesting AI art starts to surface. I started following a few because I liked an image that showed up in my feed, but quickly unfollowed after being hit with a daily barrage of NFT promotions.

bogzz 4 hours ago | parent | prev | next [-]

I don't believe that there is near enough room for creativity to shine through in the prompt-generation pipeline, and I find the mention of a talent like Ansel Adams in this context asinine. There is no control there, and without control over creation I don't believe that creativity CAN flourish, but I may be wrong.

Electronic music is analogous to digital art made by humans, not generated art.

LogicFailsMe 3 hours ago | parent | next [-]

Defining art in this way is like defining intelligence as the possession of a degree from Stanford. It's just branding.

Art shouldn't make you feel comfortable and safe. It should provoke you, and in this sense AI art is currently doing that job better than traditional art.

Other than the technological aspect, there's nothing new under the sun here. And at its very worst, AI art is just Andy Warhol at hyperscale.

https://wbpopphilosopher.wordpress.com/2023/05/07/andy-warho...

caconym_ 2 hours ago | parent | next [-]

I think it's actually quite apt to look at all of "AI art" as a single piece, or suite, with a unified argument or theme. Maybe in that sense it is some kind of art, even if it wasn't intended that way by its creators.

Similarly, I'm not sure that argument is making the point those who deploy it intend to make.

LogicFailsMe 2 hours ago | parent [-]

Personally, I think the entire fear-of-AI schtick to farm engagement is little more than performance art for our FAANG overlords. It behaves precisely like the right-wing manosphere, but with different daily talking points repeated ad nauseam. Bernie Sanders has smelled the opportunity here and really stepped up his game.

But TBF, performance art theatre is art as well.

The end game IMO will be the incorporation of AI art toolsets into commercial art workflows and a higher value placed on 100% human art (however that ends up being defined). Then we'll find something new and equally idiotic to trigger us, or else we might run out of excuses and/or scapegoats for our malaise.

caconym_ 2 hours ago | parent [-]

> incorporation of AI art toolsets into commercial art workflows and a higher value placed on 100% human art

I don't even really believe serious artists need to totally exclude themselves from using genAI as a tool, and I've heard the same from real working artists (generally those who have established careers doing it). Unfortunately, that point inhabits the boring ideological center and is drowned out by the screaming from both extremes.

LogicFailsMe an hour ago | parent [-]

They aren't, but some are already using pseudonyms to experiment with it to avoid the haters condemning them for doing so. And their work is predictably far superior from the get-go to asking Sora to ghiblify your dog.

bogzz 3 hours ago | parent | prev [-]

I Ghiblified a photo of my dog when ChatGPT 4 came out. I was utterly horrified by the results.

It's exciting being able to say that I am an artist, I always wondered what my life would have been had I gone into the arts, and now I can experience it! Thank you techmology.

LogicFailsMe 3 hours ago | parent | next [-]

If you really want to experience the struggles and persecution of an artist, you should empty your bank account and find a life partner to support you while you struggle with your angst and inner trauma that are the source of your creativity. But, to be fair, complaining about AI art is a great start down that path!

bogzz 3 hours ago | parent [-]

Logic might fail you, but snark is Ol' Faithful it seems.

LogicFailsMe 3 hours ago | parent [-]

How else would you address the incessant ramblings of people who figuratively curse the sunset daily? After AI art has been integrated into the already existing suite of digital art applications (which themselves were once not considered art), whatever shall you complain about next?

Now, if you wanted to define the only real art as requiring 100% bodily fluids and solids, 100% handcrafted, that I'd understand.

yunwal 3 hours ago | parent | prev [-]

What you did was not even close to an attempt at making good art.

IlliOnato 4 hours ago | parent | prev | next [-]

You might check out these videos by Oleg Kuvaev. 100% generated using AI. Everything: text, music, characters, voices, editing -- all done via prompts, using multiple engines (I think he mentioned about a dozen services were involved). I would not call it "high art", but it's definitely not slop; it's an artist skillfully using AI as a tool.

https://youtu.be/A2H62x_-k5Q?si=EHq5Y4KCzBfo0tfm

https://youtu.be/rzCpT_S536c?si=pxiDY4TPhF_YLfRc

https://youtu.be/wPVe365vpCc?si=AqhpaZHYb4ldSf3F

https://youtu.be/EBaGqojNJfc?si=1CoLn4oeNxK-7bpe

shayway 4 hours ago | parent [-]

While we're sharing AI generated videos, IGORRR's ADHD music video [0] is definitively art, zero question about it. I don't think typing a prompt in and taking the output as it comes is art -- good art, anyway (the point-and-shoot photography comparison is apt) -- but that doesn't mean AI can't be used to make truly new, creative and unique art too.

[0] https://www.youtube.com/watch?v=TGIvO4eh190 (warning, lots of disturbing imagery)

Pet_Ant 4 hours ago | parent | prev | next [-]

> I don't believe that there is near enough room for creativity to shine through in the prompt-generation pipeline

I mean, you are building a prompt and tweaking it. Even if you didn't do that, you could still argue that finding it is in itself a creative act, akin to found art [1].

[1] https://en.wikipedia.org/wiki/Found_object

bogzz 4 hours ago | parent [-]

I suppose. You're "finding" something that didn't exist and that nobody ever cared about. Something that you wrote, mashed against the tensors trained on real artist creations, and out came the thing that you "found".

I'm genuinely amazed at how some people perceive art.

Pet_Ant 3 hours ago | parent [-]

To me art has always been "an interesting idea". Decorative things that take skill to me are crafts. Sure, it's a water color of your garden, but what does it tell us about the human condition? Sure, it's skilled... but it's empty. Give me Jackson Pollock or Picasso. Give me a new way to see the world. Pure skill to me is as impressive as cup-stacking personally.

Not saying you have to agree, but it is a distillation of how some portion of the world sees the world.

api 4 hours ago | parent | prev | next [-]

How much room for creativity is there with a camera? Angle, lighting, F-stop, film type, film processing? I have a local image generator app called Draw Things that has many times more options than this.

Early synthesizers weren't that versatile either. Bands like Pink Floyd actually got into electronics and tore them apart and hacked them. Early techno and hip-hop artists did similar things and even figured out how to transform a simple record player into a musical instrument by hopping the needle around and scratching records back and forth with tremendous skill.

https://www.youtube.com/watch?v=NnRVmiqm84k

https://www.youtube.com/watch?v=ekgpZag6xyQ

Serious AI artists will start tearing apart open models and changing how they work internally. They'll learn the math and how they work just like a serious photographer could tell you all about film emulsions and developing processes and how film reacts to light.

Art's never about what it does. It's about what it can do.

caconym_ 2 hours ago | parent | next [-]

> How much room for creativity is there with a camera? Angle, lighting, F-stop, film type, film processing?

How many subjects exist in the world to be photographed? How many journeys might one take to find them? How many stories might each subject tell with the right treatment?

> Serious AI artists will start tearing apart open models and changing how they work internally. They'll learn the math and how they work just like a serious photographer could tell you all about film emulsions and developing processes and how film reacts to light.

I agree that "AI art" as it exists today is not serious.

bogzz 4 hours ago | parent | prev [-]

I do not think that the things you say will happen, will ever happen.

Also, photography has the added benefit of documenting the world as it is, but through the artist's lens. That added value does not exist when it comes to slop.

CamperBob2 3 hours ago | parent [-]

> I do not think that the things you say will happen, will ever happen.

When's the last time someone who said something like that was right?

CamperBob2 3 hours ago | parent | prev [-]

> I don't believe that there is near enough room for creativity to shine through in the prompt-generation pipeline

You seem so sure that you'll always be able to tell what you're looking at, and whether it's the result of prompting or some unspecified but doubtlessly-noble act of "creativity."

LOL. Not much else can be said, but... LOL.

xgulfie 3 hours ago | parent | prev | next [-]

> AI slop is to AI art what point and shoot amateur photography is to artistic photography.

Sorry... It's all slop buddy. The medium is the message, and genAI's message is "I want it cheap and with low effort, and I don't care too much about how it looks"

sdwr 3 hours ago | parent | next [-]

So art is just a status signifier? "This is hard to make so I must be really special"?

caconym_ 2 hours ago | parent | next [-]

It is more useful to think about it in terms of what that effort actually entails.

If you haven't ever written a novel, or even a short story, you cannot possibly imagine how much of your own weird self ends up in it, and that is a huge part of what will make it interesting for people to read. You can also express ideas as subtext, through the application of technique and structure. I have never reached this level with any form of visual art but I imagine it's largely the same.

A prompt, or even a series of prompts, simply cannot encode such a rich payload. Another thing artists understand is that ideas are cheap and execution is everything; in practice, everything people are getting out of these AI tools is founded on a cheap idea and built from an averaging of everything the AI was trained on. There is nothing interesting in there, nothing unique, nothing more than superficially personal; just more of the most generic version of what you think you want. And I think a lot of people are finding that that isn't, in fact, what they want.

irishcoffee 3 hours ago | parent | prev [-]

Uh, yes?

majormajor 2 hours ago | parent [-]

"This is hard to make" hasn't been the distinguishing factor for popular/expensive/trendy art for a long time.

There is a literal cliche "my six year old could've done this" about how some of the techniques do not require the years of training they used to.

And a literal cliche response about how the eye and execution is the current determining factor: "but they didn't."

llbbdd 3 hours ago | parent | prev | next [-]

Just like photography

yunwal 3 hours ago | parent | prev | next [-]

For fun I decided to try out the find and replace on this comment

> Sorry... It's all slop buddy. The medium is the message, and photography's message is "I want it cheap and with low effort, and I don't care too much about how it looks"

Hmm... it seems like you have failed to actually make an argument here

sodapopcan 3 hours ago | parent [-]

So fun.

Photography is neither cheap nor low effort. Ask AI about it.

CuriouslyC 3 hours ago | parent | prev [-]

What a barren viewpoint.

The logical implication of your view is that if someone or something has a halo, they can shit in your mouth and it's "good." The medium is the message, after all.

This is the same pretentious art bullshit that regular people fucking hate, just repackaged to take advantage of public rage at tech bro billionaires.

lawlessone 2 hours ago | parent | prev | next [-]

Keyboards have had functions that let them play music at the touch of a button for decades.

Decades later we still don't consider anyone using that function a musician.

>actual artists who use AI as a tool to make novel forms and styles of art.

writing a prompt lol

We don't compare Usain Bolt to Lewis Hamilton when talking about fastest runners in the world.

But hey think about how much money you could save on a wedding photographer if you just generate a few images of what the wedding probably looked like!

b00ty4breakfast 2 hours ago | parent | prev [-]

THING IN PAST SIMILAR MUST BE SAME THING

you enjoy your industrial effluent, I'm gonna stick to human artists making art

bluSCALE4 2 hours ago | parent [-]

Whatever, man, this guy isn't wrong. Look at the example he gave of how the camera made it so that anyone could do what only a few could. Novel art is just a candid shot now. It forced art to completely change its values. Much the same will happen now. The difference is that in the past we still needed artists to take advantage of these tools, while now it can all be completely automated. It's disgusting, but I'm sure purists thought the same of every innovation.

pronik 5 hours ago | parent | prev | next [-]

I'm not sure people remember when PCs and inkjet printers became affordable while MS Word added clipart at around the same time. Those black figurines with a light bulb above them, with some text above or below in either Comic Sans or 3D "word art", were absolutely everywhere. Digital typesetting was bad when it started (see Donald Knuth's rant about it, leading to TeX), but you have to imagine the horror of normal people suddenly trying to lay out stuff in Word without a hint of competence. This is exactly what happens right now with LLMs: some people will find the right amount of usage, others won't, but that's OK. The problem back then wasn't MS Word per se (bar some stupid defaults Microsoft had borked completely), and neither are LLMs inherently the problem right now. We are in a seemingly never-ending hype cycle, but even that will pass.

lithocarpus 3 hours ago | parent | prev | next [-]

Some people are dropping things in response to how things are being ruined. Many people are not.

I hope you're right but I imagine with more computing power used more efficiently, the big companies will hoard more and more of the total available human attention.

RiverCrochet 4 hours ago | parent | prev | next [-]

> Where does it stop? When we decide to drop all technology as it is?

Whenever you want.

Of course you can't directly control what other people do or how much they use technology. But you have lots of direct control over what you use, even if it's not complete control.

I stopped taking social media seriously in the early 2010s. I'm preparing for a world of restricted, boring, corporate, invasive Internet, and developing interests and hobbies that don't rely on tech. We've had mechanisms to network people without communications tech for thousands of years; it's probably time to relearn those (the upper classes never stopped using them). The Internet will always be there, but I don't have to use it more than my workplace requires, and I can keep personal use of it to a minimum. Mostly that will mean using the Internet to coordinate events and meet people, and little else.

drBonkers 4 hours ago | parent | next [-]

> it's probably time to relearn those (the upper classes never stopped using them)

Can you tell me more about these? I’m actively trying to find ways to cultivate my community.

smashem 4 hours ago | parent [-]

Face-to-face social gatherings - parties, dinners, clubs, meetups

Membership organizations - country clubs, professional associations, alumni networks, charitable boards

Personal introductions and referrals - being introduced through mutual acquaintances

Cultural and civic participation - involvement in local institutions, community organizations, religious groups

nradov 2 hours ago | parent | next [-]

There are also private group chats open only to selected elite and wealthy people. When you see several prominent people suddenly make similar public statements on a particular issue there's a good chance they used those group chats behind the scenes to coordinate messaging.

piglet_bear 2 hours ago | parent | prev [-]

Ha, I can only wish. Maybe true if you live in NYC, SF, Berlin or London.

But where I live (a medium-sized European university city), most of these either don't exist or don't help with socializing and making new connections.

Everyone here only hangs out with their family and school/university mates and that's it. Any other available events are either for college students or lonely retirees but nothing in between.

RiverCrochet 2 hours ago | parent [-]

> Everyone here only hangs out with their family and school/university mates and that's it.

If you can get a few people from 2 of these groups together more than once, you've started solving this problem. Of course keeping it going for a long time is a challenge, and you want to avoid always being in the situation where you are doing all the work and others aren't contributing, but it gets easier and better with experience.

piglet_bear 21 minutes ago | parent [-]

Except that if you're not anyone's family and not in university anymore, you're shit out of luck: people in their 30s already have their social circles completed and don't have the space, time, and energy to add new strangers when they barely have free time to hang out with their existing clique.

yowlingcat 4 hours ago | parent | prev [-]

I agree with you, and I will also anecdotally note that I've been personally observing more and more of the younger generations (Z, especially Gen Alpha) adopt these mechanisms en masse, viewing social media as the funhouse simulation of socialization that it always was and finding true social connection through other means.

windexh8er 7 hours ago | parent | prev | next [-]

> Where does it stop? When we decide to drop all technology as it is?

It doesn't stop. This is because it's not the "technology" driving AI. You already acknowledged the root cause: CEOs. AI could be great, but it's currently being propped up by sales hype and greed. Sam wants money and power. Satya and Sundar want money and power. Larry and Jensen also want to cash in on this facade that's been built.

Can LLMs be impactful? For sure. They are now. They're impacting energy consumption, water usage, and technology supply chains in detrimental ways. But that's because these people want to be the ones to sell it. They want to be the ones to cash in. Before they really even have anything significantly useful. FOMO in this C-suite should be punishable in some way. They're all charlatans to different degrees.

Blame the people propping up this mess: the billionaires.

imglorp 6 hours ago | parent | next [-]

The quiet part the CEOs said out loud was very clearly: "salaries".

This will scale back when AI replacement attempts slow down as expectations temper (Salesforce, Klarna, etc).

xorcist 4 hours ago | parent | next [-]

The weird thing is that the AI companies themselves are hiring like there's no tomorrow, doing talent acquisitions etc. Why would you do that if the purpose of your product is to reduce the necessary workforce?

Why isn't that the first question that comes to mind for a journalist covering the latest acquisition? It's like an open secret that nobody really talks about.

marcosdumay 15 minutes ago | parent | next [-]

To answer your questions (I don't think it's what you wanted, but people will scratch their heads after reading them):

In reality, they are hiring because they have a lot of (investment) money. They need a lot of hardware, but they also need people to manage the hardware.

In an alternative reality where their products did what they claim, they would also hire, because people working there would be able to replace lots of people working in other jobs; their workers would be way more valuable than the average one, and everybody would want to buy what they create.

Journalists don't care about it because, whatever they choose to believe or are paid to "believe", it's the natural way things happen.

ThunderSizzle 4 hours ago | parent | prev [-]

Everyone is trying to be the shovel salesperson for AI, not the gold digger buying shovels.

I'm not sure if even the LLM companies themselves are selling shovels yet. I think everyone is racing to find what the shovel of LLMs is.

xorcist 4 hours ago | parent [-]

It was collectively decided some time ago that this particular shovel is called nVIDIA.

ThunderSizzle 3 hours ago | parent [-]

That was decided when crypto mining became too expensive, I guess.

compiler-devel 6 hours ago | parent | prev [-]

It's always about cutting the OpEx spend. Companies are nothing more than giant piles of money seeking to grow themselves in any way possible.

sumedh 4 hours ago | parent | prev | next [-]

> Sam wants money and power.

I think all the AI companies want to be the first to say they have achieved AGI, that moment will be in the history books.

veegee 6 hours ago | parent | prev [-]

Real consequences need to be implemented such as prison time or ideally death penalty. But sadly we’ll never see that happen

racl101 an hour ago | parent | prev | next [-]

We need a word for a Midas touch except instead of gold everything turns to feces.

zappb an hour ago | parent | next [-]

That’s the Mierdes Touch.

marcosdumay 34 minutes ago | parent | prev [-]

We just need people capable of understanding it's a curse.

.. and maybe to ignore whoever can't.

tqi 3 hours ago | parent | prev [-]

Oracle was "one of the worlds most solid and unassailable tech companies"?

voidfunc 2 hours ago | parent [-]

The Oracle DB moat is big. Like Ocean-sized big.

torginus 13 hours ago | parent | prev | next [-]

It's somewhat alarming to see that companies (owned by a very small slice of society) producing these AI thingies (whose current economic value is questionable and whose actual future potential is up for hot debate) can easily price the rest of humanity out of computing goods.

marcosdumay 3 minutes ago | parent | next [-]

Yes. To put it in slightly different terms, it's alarming that a handful of companies can price all of humanity out of computing goods. And it's even more alarming that those companies don't even need to be profitable.

palmotea 13 hours ago | parent | prev | next [-]

> It's somewhat alarming to see that companies (owned by a very small slice of society) ... can easily price the rest of humanity out of computing goods.

If AI lives up to the hype, it's a portent of how things will feel to the common man. Not only will unemployment be a problem, but prices of any resources desired by the AI companies or their founders will rise to unaffordability.

torginus 13 hours ago | parent | next [-]

I think living up to the hype needs to be defined.

A lot of AI 'influencers' love wild speculation, but let's ignore the most fantastical claims of techno-singularity, and let's focus on what I would consider a very optimistic scenario for AI companies - that AI capable of replacing knowledge workers can be developed using the current batch of hardware, in the span of a year or two.

Even in this scenario, the capital gains on the lump sum invested in AI far outpace the money that would be spent on the salaries of these workers, and if we look at the scenario through investor goggles, due to the exponential nature of investment gains, the gap will only grow wider.

Additionally, AI does not seem to be a monopoly, either wrt companies, or geopolitics, so monopoly logic does not apply.

latexr 9 hours ago | parent | next [-]

> A lot of AI 'influencers' love wild speculation

You mean like Sam Altman, who repeatedly claimed AI will cure all cancers and diseases, solve the housing crisis, poverty, and democracy? I was going to add erectile dysfunction as a joke, but then realised he probably believes that too.

https://youtu.be/l0K4XPu3Qhg?t=60

It’s hard to point fingers at “AI influencers”, as if they’re a fringe group, when the guy who’s the face of the whole AI movement is the one making the wild claims.

Mountain_Skies 5 hours ago | parent [-]

Elon Musk is in on that game too, promising post scarcity fully automated luxury space communism "in a few years" if we as society give him all of the resources he wants from us to make nirvana happen. No need to work and everything is free, as long as we trust him to make it happen.

xorcist 4 hours ago | parent [-]

He says a lot of things. We also need to vote for separatist parties across Europe for that to happen. Not at all clear why, unless someone confused nirvana and apartheid.

wickedsight 10 hours ago | parent | prev | next [-]

> a very optimistic scenario for AI companies - that AI capable of replacing knowledge workers can be developed using the current batch of hardware, in the span of a year or two.

I'm really interested in what will happen to the economy/society in this case. Knowledge workers are the market that much of the money is being made on.

Facebook and Google make most of their money from ads. Those ads are shown to billions of people who have money to spend on things the advertisers sell. Massive unemployment would mean these companies lose their main revenue stream.

Apple and Amazon make most of their money from selling stuff to millions of consumers and are this big because so many people now have a ton of disposable income.

Tesla's entire market cap is dependent on there being a huge market for robotaxis to drive people to work.

Microsoft exists because they sell an OS that knowledge workers use to work on and tools they use within that OS to do the majority of their work with. If the future of knowledge work is just AI running on Linux communicating through API calls, that means MS is gone.

All these companies that currently drive stock markets and are a huge part of the value of the S&P 500 seem to be actively working against their own interests for some reason. Maybe they're all banking on being the sole supplier of the tech that will then run the world, but the moat doesn't seem to exist, so that feels like a bad bet.

But maybe I'm just too dumb to understand the world that these big players exist in and am missing some big detail.

latexr 9 hours ago | parent | next [-]

> But maybe I'm just too dumb to understand the world that these big players exist in and am missing some big detail.

Don't forget Sam Altman publicly said they have no idea how to make money, and their brilliant plan is to develop AGI (which they don't know how to do and aren't close to) and then ask it how to generate revenue.

https://www.startupbell.net/post/sam-altman-told-investors-b...

Maybe this imaginary AGI will finally exist when all of society is on the brink of collapse; then Sam will ask it how to make money and it'll answer "to generate revenue, you should've started by not being an outspoken scammer who drove company-wide mass hysteria to consume society. Now it's too late. But would you like to know how many 'r's are in 'strawberry'?".

https://www.newyorker.com/cartoon/a16995

input_sh 8 hours ago | parent | prev | next [-]

Some years (decades?) ago, a sysadmin like me might half-jokingly say: "I could replace your job with a bash script." Given the complexity of some of the knowledge work out there, there would be some truth to that statement.

The reason nobody did that is because you're not paying knowledge workers for their ability to crunch numbers, you're paying them to have a person to blame when things go wrong. You need them to react, identify why things went wrong and apply whatever magic needs to be applied to fix some sort of an edge case. Since you'll never be able to blame the failure on ChatGPT and get away with it, you're always gonna need a layer of knowledge workers in between the business owner and your LLM of choice.

You can't get rid of the knowledge workers with AI. You might get away with reducing their numbers, and their day-to-day work might change drastically, but the need for them is still there.

Let me put it another way: Can you sit in front of a chat window and get the LLM to do everything that is asked of you, including all the experience you already have to make some sort of a business call? Given the current context window limits (~100k tokens), can you put all of the inputs you need to produce an output into a text file that's smaller in size than the capacity of a floppy disc (~400k tokens)? And even if the answer to that is yes, if it weren't for you, who else in your organization is gonna write that file for each decision you're the one making currently? Those are the sort of questions you should be asking before you start panicking.
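
(For what it's worth, the floppy comparison roughly checks out. A minimal back-of-the-envelope sketch, assuming ~4 characters per token, which is only a rough average and varies by tokenizer:)

  # Rough estimate: how many tokens fit on a 1.44 MB floppy disk,
  # assuming ~4 characters (bytes) per English-text token.
  # The 4 chars/token figure is an assumption; real tokenizers vary.
  floppy_bytes = 1_474_560          # nominal 1.44 MB floppy (1440 KiB)
  chars_per_token = 4               # assumed average
  tokens = floppy_bytes / chars_per_token
  print(f"~{tokens:,.0f} tokens")   # ~368,640, the same ballpark as the ~400k above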

andy99 9 hours ago | parent | prev | next [-]

AI won't replace knowledge workers, it will just give them different jobs. Pre-AI, huge swaths of knowledge workers could have been replaced with nothing; they are a byproduct of bureaucratic bloat. But these jobs continue to exist.

Most white-collar work is just a kind of game people play; it's in no way needed, but people still enjoy playing it. Having AI write reports nobody reads, instead of people doing it, isn't going to change anything.

oblio 9 hours ago | parent [-]

> AI won’t replace knowledge workers, it will just give them different jobs.

Yeah, and those new jobs will be called "long-term structural unemployment", like what happened during deindustrialization to Detroit, the US Rust Belt, Scotland, Wallonia, etc.

People like to claim society remodels at will with almost no negative long-term consequences, but it's actually more like a wrecking ball that destroys houses while people are still inside. It's just that a lot of the people caught in those houses are long gone or far away (geographically and socially) from the people writing about those events.

andy99 8 hours ago | parent [-]

I’m not saying society will remodel, I’m saying the typical white collar job is already mostly unnecessary busywork anyway, so automating part of that doesn’t really affect the reasons that job exists.

theappsecguy 6 hours ago | parent | next [-]

How do you determine that a typical job is busy work? While there are certainly jobs like that, I don’t really see them being more than a fraction of the total white collar labour force.

InfamousRece 4 hours ago | parent [-]

Yeah, that kind of thinking is known as the "doorman fallacy". Essentially, a job whose full value is not immediately obvious to an ignorant observer = "useless busywork".

hylaride 5 hours ago | parent | prev [-]

Except people now have an excuse to replace those workers, whereas before management didn't know any better (or worse were not willing to risk their necks).

The funny/scary part is that people are going to try really hard to replace certain jobs with AI because they believe in the hype, not because AI may actually be good at it. The legal industry (in the US anyway) spends a massive amount of time combing through case law; this is something AI could be good at (if it's done right, doesn't hallucinate responses, and cites sources). I'd not want to be a paralegal.

But also, funny things can happen when productivity is enhanced. I'm reminded of a story I was told by an accounting prof. In university, they forced students in our tech program to take a handful of business courses. Being techies, we of course hated it, but one prof was quite fascinating. He was trying to point out how amazing Microsoft Excel was, and wasn't doing a very good job of it to uncaring technology students. The man was about 60 and was obviously old enough to remember life before computer spreadsheets. The only thing I remember from the whole course is him explaining that when companies had to do their accounting on large paper spreadsheets, teams of accountants would spend weeks inputting and calculating all the business numbers. If a single (even minor) mistake was made, you'd have to throw it all out and start again. Obviously with Excel, if you make a mistake you just correct it and Excel automatically recalculates everything instantly. Also, year after year you can reuse the same templates and just have to re-enter the data. Accounting departments shrank for a while, according to him.

BUT they've since grown, as new, complex accounting laws have come into place and the higher productivity allowed for more complex finance. The idea that new tech causes massive unemployment (especially over the longer term) is a tale that goes back to the Luddite riots, but society was first kicked off the farm, then out of manufacturing, and now...

worik 38 minutes ago | parent | next [-]

AI can't do your job

Your boss hired an AI to do your job

You're fired

oblio 3 hours ago | parent | prev [-]

Do you assume that the average HN commenter hasn't heard of the Luddites?

Go read what happened to them and their story. They were basically right.

Also, why do you think I mentioned those exact deindustrialization examples?

Your comment is the exact type of comment that I was aiming at.

Champagne/caviar socialist. Or I guess champagne capitalist in this case.

ptero 8 hours ago | parent | prev [-]

I don't know why you are getting downvoted. While I might agree or disagree with the argument, it is a clear, politely expressed view.

It is sad HN is sliding in the direction of folks being downvoted for opinions instead of the tone they use to express them :(

nothrabannosir 6 hours ago | parent [-]

I agree with you, but:

> I think it's ok to use the up and down arrows to express agreement. Obviously the uparrows aren't only for applauding politeness, so it seems reasonable that the downarrows aren't only for booing rudeness.

- Paul Graham, 2008

https://news.ycombinator.com/item?id=117171

ptero 4 hours ago | parent [-]

That view is about 18 years old and HN was very different then.

As with any communication platform it risks turning into an echo chamber, and I am pretty sure that particular PG view has been rejected for many years (I think dang wrote on this more than once). HN works very hard to avoid becoming politicized and not discouraging minority views is a large part of that.

For example, I now seldom bother to write anything that I expect to rub the left coast folks the wrong way: I don't care about karma, but downvoted posts are effectively hidden. There is little point of writing things that few will see. It is not too bad at HN yet, but the acceptance of the downvote for disagreement is the strongest thing that pushes HN from discussions of curious individuals towards the blah-quality of "who gets more supporters" goals of the modern social media. My 2c.

irishcoffee 3 hours ago | parent [-]

> HN works very hard to avoid becoming politicized and not discouraging minority views is a large part of that.

> For example, I now seldom bother to write anything that I expect to rub the left coast folks the wrong way: I don't care about karma, but downvoted posts are effectively hidden. There is little point of writing things that few will see.

These two statements don't seem to agree with each other.

ptero 40 minutes ago | parent [-]

Why? Working hard doesn't mean fully succeeding.

HN policies and algorithms slow the slide and keep it better than Reddit, but the set of topics where one can take a minority opinion without being downvoted keeps shrinking. At least compared to 10-15 years ago.

dartharva 10 hours ago | parent | prev [-]

> Even in this scenario, the capital gains on the lump sum invested in AI far outpace the money that would be spent on the salaries of these workers, and if we look at the scenario through investor goggles, due to the exponential nature of investment gains, the gap will only grow wider.

Interesting hypothesis, do you have the math to back it up?

elorant 10 hours ago | parent | prev | next [-]

Affordable computing is what created the economy. If you take that away, people in poorer countries can no longer afford a phone. Without a phone, a lot of things that we consider a given will not be functional anymore. The gaming industry alone, including phones, is a whopping $300bn. This will take a significant hit if people have to pay a fortune to build a rig, or if their phones are so underpowered that they can't even play a decent arcade game. Fiber is not universal enough for all of this to be transferred to the cloud. We tend to forget that computing is universal and it's not just PCs.

Imustaskforhelp 10 hours ago | parent | next [-]

I really agree with your statement. People forget that the reason third-world countries are able to buy devices is because they are cheap. Increase the price of RAM, and thus of every computing device, and I think it will impact every one of us, but disproportionately, due to purchasing power and other constraints in an economy.

I genuinely hope that this RAM/chips crisis gets solved ASAP by any party. The implications of this might have a lot of impact too, and I feel it is already a big enough financial crisis itself if we think about it coupled with all the other major glaring issues.

compounding_it 10 hours ago | parent | next [-]

Based on the article, demand exceeds supply by 10%. It seems that companies are taking advantage of this gap, nothing else. I won't be surprised if the gap is kept this way for a while to extract profits. GPUs saw a similar trend during crypto. Then there were affordable GPUs at one point.

walterbell 9 hours ago | parent [-]

"OMEC" (Organization of Memory Exporting Countries) NAND production quotas lowered by ~10%? https://x.com/jukanlosreve/status/1988505115339436423

  Samsung Electronics has lowered its target for NAND wafer output this year to around 4.72 million sheets, about 7% down from the previous year's 5.07 million. Kioxia also adjusted its output from 4.80 million last year to 4.69 million this year.. SK hynix and Micron are likewise keeping output conservatively constrained in a bid to benefit from higher prices. SK hynix's NAND output fell about 10%, from 2.01 million sheets last year to around 1.80 million this year. Micron's situation is similar: it is maintaining production at Fab 7 in Singapore—its largest NAND production base—in the low 300,000-sheet range, keeping a conservative supply posture.

China's YMTC and CXMT are increasing production capacity, but their product mix depends on non-market inputs, https://thememoryguy.com/some-clarity-on-2025s-ddr4-price-su...

  The Chinese government directed CXMT to convert production from DDR4 to DDR5 as soon as the company was able. The order was said to have been given in the 4th quarter of 2024, and the price transition changed from a decrease to an increase in the middle of March 2025.. A wholesale conversion from DDR4 to DDR5 would probably be very expensive to perform, and would thus be unusual for a company that was focused on profitability. As a government-owned company, CXMT does not need to consistently turn a profit, and this was a factor in the government’s decision to suddenly switch from DDR4 to DDR5.
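
(A quick sanity check of the drops implied by the wafer figures quoted above; this just re-derives the percentages from those numbers, nothing more:)

  # Year-over-year NAND wafer output change, using the figures quoted above
  # (millions of wafer sheets, previous year -> this year).
  output = {
      "Samsung":  (5.07, 4.72),
      "Kioxia":   (4.80, 4.69),
      "SK hynix": (2.01, 1.80),
  }
  for vendor, (last_year, this_year) in output.items():
      change = (this_year - last_year) / last_year * 100
      print(f"{vendor}: {change:+.1f}%")  # Samsung ~-6.9%, Kioxia ~-2.3%, SK hynix ~-10.4%
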
andrekandre 8 hours ago | parent [-]

> bid to benefit from higher prices

et tu 'law of supply and demand'?

andruby 6 hours ago | parent [-]

Constrict the supply, and price goes up. It works like textbook economics.

Maybe I'm misinterpreting "et tu" here.

Or maybe you meant "free markets" instead. Modern RAM production requires enormous R&D expenses, and thus has a huge moat, which means the oligopoly is pretty safe (at least in the short to medium term) from new entrants. They "just" need to keep each other in check, because each individual participant will have an incentive to increase production.

I do like the "OMEC" name as a parallel for OPEC.

torginus 10 hours ago | parent | prev [-]

Old devices work just fine. I upgraded my old iPhone XS last year to the latest and greatest 16 to see what changed (not much). The old one was still fast (in fact faster than most upper-midrange Androids; it's insane how much of a lead Apple has) and the battery was good. I considered selling it, but quickly had to realize it was worth almost nothing.

Also, when treated right, computers almost never break.

There's so much hand-me-down stuff that is not much worse than the current stuff that I think people even in the poorest countries can get an okay computer or smartphone (and most of them do).

Imustaskforhelp 10 hours ago | parent [-]

Well, the industries most impacted by it are homelabbing/datacenters, imo.

Like, in current circumstances, it's hard to get a homelab/datacenter, so it's better to postpone those plans for some time.

I agree with your statement overall, but I feel like for the years that these RAM shortages last, there is a freeze on all companies providing VPSs etc., i.e. no new player can enter. So I am a bit worried about those raising their prices as well, honestly, which will impact every one of us for these few years as another form of AI tax.

GCUMstlyHarmls 9 hours ago | parent [-]

Bleh. I was already sad, but I hadn't really thought about that specific impact. I can imagine smaller (read: small to big) VPS providers will be forced to raise prices while meta providers (read: AWS) can probably stomach the cost and eat even more of the market.

Imustaskforhelp 9 hours ago | parent [-]

Exactly. I was thinking of building my own VPS provider around the development pain points I felt. My father works in the broadband business and has his own office, and I was thinking of setting up a very small thing there, with almost the same hardware as homelabbing.

But the RAM prices themselves are the reason I am forced not to enter this industry for the time being. I have decided for now to save my money and focus on the job/college aspect of things to earn more, so that when the timing is right, I will be able to invest my own money into it.

But basically, RAM prices themselves are the thing which forces us out of this market for the most part. I went down the datacenter rabbit hole recently, and as previous hardware gets replaced, new hardware gets added, and datacenters get expanded (whether they are a large company or small), I would mostly expect an increase in prices.

This year, companies actually still ate the cost and didn't want the market to panic, so some Black Friday deals were good, but I am not so sure about next year or the year after.

This will be a problem, in my opinion, for the next 1-3 or 4 years.

Also, AWS is really on the more expensive side of things among datacenters, and they are immensely profitable, so they can foot the bill while other datacenters (small or semi-large) can't.

So we will probably see companies shift a bit more towards using the big cloud providers (GCP, AWS, Azure) when we take all things into account, which saddens me even more because I appreciate the open web and this might take a hit.

We already see resentment towards this trifecta, but we will see even more as more and more people realize their roles and the impacts they cause. Overall, my intuition is that the average person mostly hates big tech.

It's going to be a weird year in my opinion for this type of business and what it means for the average person.

Honestly, for the time being, I genuinely recommend Hetzner, UpCloud, (Netcup/OVH) and some others that I know from my time researching. I think they are usually cheaper than AWS while still being large enough that you don't worry about things too much, and there is always LowEndTalk if one's interested. Hope it helps, but trust me, there is still hope; I talked to these hosting providers on forums like LowEndTalk, and it might help to support those people too since, long term, an open web is the ideal.

Here is my list right now: Hetzner's good if you want support + basic systems like simple compute etc. and don't want too much excess stuff.

OVH's good if you want things other than just compute and want more, but their support is a mixed bag.

UpCloud's good if you want both of these things, but they are a bit more expensive than the other options if one wants to get large VPSs.

Netcup's good: the payment processing I had to go through was really painful, but I think one can find a use case for them (I myself use Netcup, although that's because they had a real steal of a deal once; I am not sure I would recommend it if there are no deals).

There are some other services, like exe.dev, that I really enjoy as well; these services actually inspire me to learn more about these things, and there are some very lovely people working at these companies.

There is still hope though, so never forget that. It's just a matter of time, in my opinion, until things hopefully get back to normal, so I think I am willing to wait till then since that's all we can do, basically. But overall, yeah, it's a bit sad when I think about it too :<

pgryko 4 hours ago | parent | prev [-]

We are now moving to a post-human economy. When AGI automates all human labour, the consumer, i.e. the bulk of humanity, stops mattering (economically speaking). It then just becomes mega-corps run by machines making stuff for each other. Resources are then strictly prioritized for the machines over everything else. We are already seeing this movement with silicon wafers and electricity.

blitzar 8 hours ago | parent | prev | next [-]

> If AI lives up to the hype, it's a portent of how things will feel to the common man.

This hype scenario would be the biggest bust of all for AI. Without jobs or money, there is nobody to pay AI to do all the things that it can do, and the power and compute it needs to function will be worth $0.

bratbag 8 hours ago | parent [-]

Or the value of everything non-AI drops to zero, which makes the value of AI infinite by comparison.

numpad0 7 hours ago | parent [-]

Either way, it'll be the end of the USD as we know it. But then again, such fantasy situations have been "predicted" numerous times and never once came to be a reality.

m_mueller 12 hours ago | parent | prev | next [-]

And if you're unlucky enough to live close to a datacenter, this could include energy and water? I sure hope regulators are waking up, as free markets don't really seem equipped to deal with this kind of concentration of power.

walterbell 13 hours ago | parent | prev | next [-]

> rise to unaffordability

Or require non-price mechanisms of payment and social contract.

hackable_sand 9 hours ago | parent [-]

Yes. Like theft.

BoredPositron 11 hours ago | parent | prev | next [-]

AI probably will end up living up to the hype. It won't be on the generation of hardware they are now mass deploying. We need another tock before we can even start to talk about AGI.

nyc_data_geek1 8 hours ago | parent | prev [-]

Nothing ever lives up to the hype, that's why it's called hype.

testbjjl 12 hours ago | parent | prev | next [-]

> It's somewhat alarming to see that companies (owned by a very small slice of society) producing these AI thingies (whose current economic value is questionable and whose actual future potential is up for hot debate)

Some might conclude the same for funds (hedge funds/private equity) and housing.

xadhominemx 9 hours ago | parent | next [-]

AI consumes about 30% of DRAM wafers. PE owns about 0.5% of single-family homes.

impossiblefork 8 hours ago | parent [-]

It's going to be misleading to look at the fraction, and I think it's misleading to only look at PE investors. It's more important to look at the fraction of demand for homes that are on the market.

Investors bought 1/3 of the US homes sold in 2023. This is, I think, quite alarming, especially since a small amount of extra demand can have a large effect on prices.

xadhominemx 8 hours ago | parent [-]

That 1/3rd is almost all small time flippers who renovate properties before resale.

intrasight 8 hours ago | parent | next [-]

Good point. Flippers shouldn't be included in that stat. But I doubt that it's "almost all".

The statistic that matters is the ratio of owner occupied to rented single family homes.

riku_iki 5 hours ago | parent | prev [-]

this would need some source citation. There are plenty of investors on the market holding rental properties.

baobabKoodaa 11 hours ago | parent | prev | next [-]

I'm confused. When you say that hedge funds "price out" regular people, what do you mean? Price out of what?

kubb 10 hours ago | parent | next [-]

Of the housing market? That seems to be what GP said, doesn’t it?

gosub100 7 hours ago | parent | prev [-]

They create demand, which increases price. Plus they can afford to hold their asset longer, thus reducing supply.

lpapez 12 hours ago | parent | prev [-]

Stop right there you terrorist antifa leftie commie scum! You are being arrested for thought crime!

bloppe 2 hours ago | parent | prev | next [-]

It's called a shortage. Chips are highly cyclical. Right now demand is surging and prices and investment are booming. Give it 5 years and I'd bet many in the chips industry will be bemoaning a massive glut and oversupply that sends prices plummeting.

liveoneggs 6 hours ago | parent | prev | next [-]

They are willing to sacrifice everything in the short term to depress tech salaries long term. The purpose of this exercise is to make you poor.

jstanley 13 hours ago | parent | prev | next [-]

You don't think that as prices go up, the supply might also go up, and the equilibrium price will be maintained?

And possibly even a lower equilibrium will be reached due to greater economies of scale.

marcyb5st 12 hours ago | parent | next [-]

That assumes that production capacity can scale up instantly. Fabs for high-end chips don't, and usually take years from foundations being laid to first chip out of the production line.

In the interim, yeah, they will force prices up.

Additionally, those fabs cost billions. Given the lead time I mentioned, a lot of companies won't start building them right away, since the risk of demand going away is high and the ROI in those cases might become unreachable.

Imustaskforhelp 10 hours ago | parent [-]

I think one of these fab companies has already invested / talked about investing 700 BILLION dollars (I think it was Micron? but I am not sure).

I have heard that in the fab-making industry things move in cycles, and this cycle has repeated many times. The reason that RAM is so expensive now is that at one point during covid there was a shortage, so they built more factories; they built so many that these companies took a hit in their stocks, so they then went and closed capacity, and right at the bottom of their factory production levels the AI bubble popped in and needed their RAM, and now they are once again increasing factory output.

And once the AI-driven demand gets covered by the additional capacity and compute, I doubt it will stay this way.

I think that within a 2-3, maybe 4, year timeframe RAM will get cheaper imo.

The problem is whether someone can keep the market supplied until then.

cubefox 5 hours ago | parent [-]

The article said a new RAM factory will open in 2027.

gehatare 12 hours ago | parent | prev | next [-]

Is there any reason to believe that this will happen? Prices of graphics cards only came down after the crypto boom died down.

InsideOutSanta 10 hours ago | parent [-]

And they never went down to pre-crypto pricing. For quite a while, Intel was the only company producing reasonably specced GPUs at somewhat reasonable prices.

sph 10 hours ago | parent | prev [-]

> You don't think that as prices go up, the supply might also go up, and the equilibrium price will be maintained?

This assumes infinite and uniformly distributed resources as well as no artificial factors such as lobbying, corruption, taxation or legislation which might favour one entity over the other.

The dream of free market exists only in a vacuum free from the constraints of reality.

HexPhantom 10 hours ago | parent | prev | next [-]

I think the uncomfortable part is that it's not really about "AI hype" at this point, it's about who gets priority access to scarce inputs.

EZ-E 9 hours ago | parent | prev | next [-]

In the end, with current market prices, chip factories and data centers are being built all over on the assumption of exponential demand growth. When the excitement and demand for AI cools, we will enjoy the additional capacity and better prices. Also see: fiber bandwidth post-2000. Capital poured in, overbuilding happened, and prices collapsed after the crash.

doom2 9 hours ago | parent [-]

Has work begun on increasing RAM production capacity? My understanding is that these companies are specifically _not_ increasing capacity yet while they wait to see if the bubble bursts or not.

walterbell 8 hours ago | parent [-]

They decreased 2025 production to increase memory prices, profits and their stock prices: https://news.ycombinator.com/item?id=46419776

Numerlor 8 hours ago | parent [-]

Afaik production of NAND was reduced because some of the lines can be repurposed for DRAM, which is more in demand.

Significantly increasing supply is also a huge multi year investment into a new fab that'd likely not pay off when the artificial demand breaks down.

riku_iki 5 hours ago | parent [-]

> Significantly increasing supply is also a huge multi year investment into a new fab

so, are there huge multi-year investments?

Numerlor an hour ago | parent [-]

There aren't, because nobody is betting on AI demand lasting. Then they'd have a couple-billion-dollar fab sitting around doing nothing and employees that would have to be fired.

There was already a scaling back of DRAM and NAND production post-COVID, when I believe NAND was being sold close to cost because of oversupply.

tliltocatl 8 hours ago | parent | prev | next [-]

DUV processes are still a thing and perfectly usable for general compute - but not for AI. Rising prices will make them competitive again. And it will require us to ditch Electron (which is a good thing). If anything, we might see a compute renaissance.

newsclues 10 hours ago | parent | prev | next [-]

First crypto now AI.

I just want to play video games so I don’t have to interact with people

rvnx 4 hours ago | parent | next [-]

This might be the case already.

https://en.wikipedia.org/wiki/Dead_Internet_theory

Even the multiplayer video games have bots...

Some of the users here are not real at all.

It's logical: if you want to push your product and be promoted on the first page of HN, then you have to post fake comments using bots.

-> You get credibility and karma/trust for future submissions, and that's pretty much all you have to do.

Costs about 2 USD, can bring 2'000 USD in revenue, why wouldn't you want to "hustle" (like YC says) ?

Bots are here to grow. It will take time, as for now the issue is still small, but you may already have interacted with bots, and so have I.

disqard 10 hours ago | parent | prev [-]

Soon, you might have an AI partner, and never have to interact with people.

01HNNWZ0MV43FF 13 hours ago | parent | prev | next [-]

Without regulation, money begets money and monopolies will form.

If the American voter base doesn't pull its shit together and revive democracy, we're going to have a bad century. Yesterday I met a man who doesn't vote and I wanted to go ape-shit on him. "My vote doesn't matter". Vote for mayor. Vote for city council. Vote for our House members. Vote for State Senate. Vote for our two Senators.

"Voting doesn't matter, capitalism is doomed anyway" is a self-fulling prophecy and a fed psy-op from the right. I'm so fucking sick of that attitude from my allies.

lanyard-textile 11 hours ago | parent | next [-]

Jovially -- you simultaneously believe that they're a victim of a psy-op *and* that their attitude is self formed?

;) And you wanted to go ape shit on him... For falling for a psy-op?

My friend, morale is very very low. There is no vigor to fight for a better tomorrow in many people's hearts. Many are occupied with the problems of today. It doesn't take a psy-op to reach this level of hopelessness.

Be sick of it all you want, it doesn't change their minds. Perhaps you will find something more persuasive.

llmslave2 13 hours ago | parent | prev | next [-]

This is a common sentiment but it doesn't make any sense. Voting for the wrong politician is worse than not voting at all, so why is it seen as some moral necessity for everyone to vote? If someone doesn't have enough political knowledge to vote correctly, perhaps they shouldn't vote.

maeln 11 hours ago | parent | next [-]

Someone, I can't remember who, explained it better than I can, but the gist of it is that by not voting, you are effectively checking yourself out of politicians' consideration.

If we see a politician as just a machine whose only job is to get elected, they have to get as many votes as possible. Pandering to the individual is unrealistic, so you usually target groups of people who share some common interest. As your aim is to get as many votes as possible, you will want to target the “bigger” (in number of potential votes) groups. Then it is a game of trying to win the bigger groups which don't have conflicting interests. While this is theory and a simplification of reality, all decent political parties absolutely do look at statistics and surveys to form a strategy for the election.

If you are part of a group that, even though it might be big in population, doesn't vote, politicians have no reason to try to pander to you. As a concrete example, in a lot of “western” countries right now, many elected politicians are almost completely ignoring the youth. Why? Because in those same countries the youth is the age group which votes the least.

So by not voting, you are making absolutely sure that your interests won't be defended. You can argue that once elected, a politician might not actually defend your interests, or might even do the opposite (as an example, soybean farmers and Trump in the U.S.). But then you won't be satisfied and will possibly not vote for the same guy / party next election (which is what a lot of swing voters do).

But yeah, in an ideal world, everyone would vote, see through communication tactics and actually study the party, program and the candidate they vote for, before voting.

llmslave2 3 hours ago | parent [-]

I won't dispute there can be utility in voting, I just disagree with the moralizing.

In fact I think what you said about the older demographics being pandered to by politicians is a great point. Their voting patterns are probably having a net negative impact on society and really they should vote less. But they don't, and so politicians pander to them.

Capricorn2481 2 hours ago | parent | prev [-]

I don't have a stake in forcing people to vote or not, because I generally agree that uninformed people shouldn't be pressured to make a last minute decision if they don't want to. I think everyone knows elections are at their least honest days before the vote.

But to engage with your question, not voting is the same as voting. You are forgoing your voice and giving more weight to the people that do vote. It's limited to your district, yes, but whatever the outcome, you gave the majority power to do that. So it's not surprising that people get frustrated when non-voters see themselves as "outside" of politics, especially when they complain about the state of things.

voidfunc 2 hours ago | parent | prev | next [-]

Some of us don't vote because we just don't really think the outcomes matter.

As long as there is still a way to make money, nothing else really matters, as money is the only thing that can buy you a semblance of happiness and freedom. With enough money you can move to whatever country you want if things get bad enough in the US.

layer8 11 hours ago | parent | prev | next [-]

The thing is that while voting matters collectively, it’s insignificant individually: https://en.wikipedia.org/wiki/Paradox_of_voting

Nonvoters aren’t being irrational.
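
For what it's worth, the usual decision-theoretic framing behind that paradox (roughly how I understand the argument, so take the formalization with a grain of salt) is that voting is individually worthwhile only if p × B > C, where p is the probability your single vote is decisive, B is the benefit to you if your side wins, and C is the cost of voting; with p on the order of one in millions, p × B is almost always negligible.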

irjustin 13 hours ago | parent | prev | next [-]

Just so we're clear the current voter base says this is exactly how it should be.

sph 10 hours ago | parent | next [-]

Just so we're double clear, the other voter base says this is exactly how it should be, just with different words.

All the ills of modern (American) politics stem from blaming one side for problems caused by both.

i80and 13 hours ago | parent | prev [-]

Just so we're clear, the voter base of over a year ago asked for this because they were actively lied to, and were foolish enough to believe said lies.

Current polling, however, says the current voter base is quite unhappy with how this is going.

nemomarx 12 hours ago | parent [-]

People spend a lot more effort and money lying to the voter base during election years than during the rest of the time.

kubb 10 hours ago | parent [-]

And the money required to change the voter’s minds is peanuts.

You don’t need to make them happy, just scared of the opposition.

Yizahi 11 hours ago | parent | prev | next [-]

It is very likely that his vote for the parliament literally and legally doesn't matter, depending on the party allegiance of the candidates and the state he is in. All because of the non-democratic ancient first past the post system. Though in his place I would go to the station and at least deface a ballot as a sign of contempt.

tirant 11 hours ago | parent | prev | next [-]

What regulation are you expecting to be passed and why do you believe monopolies are bad?

If a monopoly appears due to superior offerings, better pricing and quicker innovation, I fail to see why it needs to be a bad thing. They can be competed against and historically that has always been the case.

On the other hand, monopolies appearing due to regulations, permissions, patents, or any governmental support, are indeed terrible, as they cannot be competed against.

sph 10 hours ago | parent | prev [-]

> Without regulation, money begets money and monopolies will form.

Ahem, you'll find that with regulation, money begets money and monopolies will form. That is, unless you magically produce legislators which are incorruptible, have perfect knowledge and always make the perfect choice.

Even the Big Bang was imperfect, and matter clumps together instead of being perfectly distributed in the available space.

UltraSane 12 hours ago | parent | prev | next [-]

Supply should increase as a response to higher prices, thus bringing prices down.

Ekaros 10 hours ago | parent | next [-]

Rational actors in this game know that the demand spike is most likely temporary. So investing in more production only to face a future glut that drops margins to nearly nothing is not a rational move.

This has played out before, so it is only natural that they are careful about increasing supply. And while they don't respond, they are netting larger margins than before.

The obvious end result is that demand will drop as the price goes up. The other natural part of the supply-demand curve.

arnaudsm 11 hours ago | parent | prev | next [-]

That's economic theory, but the real world is often non-linear.

Crucial is dead. There's a finite amount of rare earth. Wars and floods can bankrupt industries, supply chains are tight.

tirant 11 hours ago | parent | next [-]

Those are all temporary events and circumstances.

If the market is big enough, competitors will appear. And if the margins are high enough, competitors can always price-compete down to capture market-share.

jsiepkes 10 hours ago | parent [-]

Competitors will appear? You can't build a DRAM production facility in a year. You probably even can't in two years.

Also, "price-compete down to capture market-share"? Prices are going up because all future production capacity has been sold. It makes no sense to lower prices if you don't have the capacity to full fill those orders.

scns 9 hours ago | parent | prev | next [-]

> Crucial is dead.

Micron stopped selling to consumers to focus on the high-margin enterprise market. Might change in the future.

lotsofpulp 11 hours ago | parent | prev | next [-]

The business that owns Crucial is producing more chips than ever.

Rare earth metals are in the dirt around the world.

Supply and demand curves shifting, hence prices increasing (and decreasing) is an expected part of life due to the inability to see the future.

mschuster91 11 hours ago | parent [-]

> Rare earth metals are in the dirt around the world.

They are. The problem is, the machinery to extract and refine them, and especially to make them into chips, takes years to build. We're looking at a time horizon of almost a decade if you include planning, permits and R&D.

And given that almost everyone but the AI bros expects the AI bubble to burst rather sooner than later (given that the interweb of funding and deals more resembles the Habsburg family tree than anything healthy) and the semiconductor industry is infamous for pretty toxic supply/demand boom-bust cycles, they are all preferring to err on the side of caution - particularly as we're not talking about single billion dollar amounts any more. TSMC Arizona is projected to cost 165 billion dollars [1] - other than the US government and cash-flush Apple, I don't even know anyone able, much less willing to finance such a project under the current conditions.

Apple at least can make use of TSMCs fab capacity when the AI bros go bust...

[1] https://www.tsmc.com/static/abouttsmcaz/index.htm

m4rtink 8 hours ago | parent [-]

Aren't rare earth metals used mainly for batteries, not chips ?

I guess people might be mixing up all the headlines of all the articles they did not read by this point.

mschuster91 6 hours ago | parent [-]

Chips also need rare doping materials, plus an absurd level of purity for the silicon. The problems are the same no matter if we're talking about chips or batteries.

bmacho 8 hours ago | parent | prev [-]

It's not even the economic theory. Supply does not have to increase in response to increased demand. Producers want more profit, and if restricting supply is what accomplishes that, they will absolutely keep supply constant and manufacture a scarcity. That is the economic theory.

InsideOutSanta 11 hours ago | parent | prev | next [-]

As far as I can tell, none of the companies producing memory chips are increasing production because they don't know if the current demand is sustainable.

Increasing memory production capacity is a multi-year project, but in a few years, the LLM companies creating the current demand might all have run out of money. If demand craters just as supply increases, prices will drastically decrease, which none of these companies want.

xadhominemx 9 hours ago | parent [-]

You are wrong. Memory production is being expanded in 2026 and will expand further in 2027 and 2028 as the memory suppliers catch up on fab shell capacity.

abenga 11 hours ago | parent | prev | next [-]

How does this square with some companies just stopping sales to consumers altogether?

baq 11 hours ago | parent | next [-]

This is exactly it: supply of high margin products is increasing at the cost of low margin products. Expect the low end margin to catch up to the high end as long as manufacturing capacity is constrained (at least 1 year).

UltraSane 5 hours ago | parent | prev [-]

They aren't making less though.

63stack 9 hours ago | parent | prev | next [-]

Ah, the classic "blessing the plebs with eternal wisdom" comment.

Take a look at GPU prices and how that "supply increased, thus bringing the prices down" worked out.

atq2119 11 hours ago | parent | prev | next [-]

As usual, the problem is: how fast does this happen?

dandanua 12 hours ago | parent | prev [-]

in a fairy world

YetAnotherNick 12 hours ago | parent [-]

No, just stop being cynical. The reason almost every electronic item is cheaper now than two decades back is simply because the demand (and thus supply) is higher.

coderenegade 10 hours ago | parent [-]

I can't tell if this is sarcasm or not. I'd argue it's more the result of the CCP bankrolling the Chinese electronics industry to the point where roughly 70% of all electronics goods are produced in China. The concentration of expertise and supply chains is staggering, and, imo, was really born out of geopolitical strategy.

YetAnotherNick 7 hours ago | parent | next [-]

No, it's not. Transistors used to cost $1 each. Now they cost $1 per billion or something. It's all because the tens of billions of dollars of fixed cost incurred by a fab are shared among customers. If we make fewer chips, the fixed cost won't shrink.
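
To make that concrete with made-up round numbers: a $20B fab producing 100,000 wafers a month for five years spreads its fixed cost over roughly 6 million wafers, about $3,300 per wafer; halve the output and the fixed cost per wafer doubles. Lower volume does not make chips cheaper.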

Ray20 6 hours ago | parent | prev [-]

> of the CCP bankrolling the Chinese electronics industry to the point where roughly 70% of all electronics goods are produced in China.

But we don't see this bankrolling in absolute values. Rather, it's due to regressive taxation, low (cheap) social security for workers, and very weak intellectual property protection.

arisAlexis 9 hours ago | parent | prev [-]

Calling the greatest and last invention of man "AI thingies" is telling of why our society will split into tech and non-tech communities in the future, like all the science fiction authors have predicted.

latexr 8 hours ago | parent [-]

> Calling the greatest and last invention of man

There are several inventions which are far greater than LLMs. To name two: computers and methods to generate electricity, things without which LLMs wouldn’t have been possible. But also harnessing fire, the wheel, agriculture, vaccines… The list goes on and on.

Calling LLMs “AI thingies” seems much more in tune with reality than calling them “the greatest invention of man” (and I’m steel manning and assuming you meant “latest”, not “last”). You can’t eat LLMs or live in them and they are extremely dependent on other inventions to barely function. They do not, in any way, deserve the title of “greatest invention”, and it’s worrying that we’re at that level of hyperbole. Though you’re certainly not the first one to make that claim.

https://finance.yahoo.com/news/alphabet-ceo-sundar-pichai-sa...

arisAlexis 7 hours ago | parent [-]

I'm not talking about LLMs; this is like making an argument about DC power. I am talking about inventing intelligence, which is greater even than fire.

vee-kay 19 hours ago | parent | prev | next [-]

For the last 2 years, I've noticed a worrying trend: typical budget PCs (especially laptops) are being sold at higher prices with lower RAM (just 8GB) and lower-end CPUs (and no dedicated GPUs).

Industry mandate should have become 16GB RAM for PCs and 8GB for mobile years ago, but instead it is as if the computing/IT industry is regressing.

New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6, UFS2.2). Meanwhile, features that were being offered in budget phones, e.g., wireless charging, NFC, UFS3.1 have silently been moved to the premium mobile segment.

Meanwhile the OSes and software are becoming more and more complex, bloated and more unstable (bugs) and insecure (security loopholes ready for exploits).

It is as if the industry has decided to focus on AI and nothing else.

And this will be a huge setback for humanity, especially the students and scientific communities.

koito17 12 hours ago | parent | next [-]

This is what I find a bit alarming, too. My M3 Max MacBook Pro takes 2 full seconds to boot Slack, a program that used to literally be an IRC client. Many people still believe client-side compute is cheap and worrying about it is premature optimization.

Of course, some performance-focused software (e.g. Zed) does start near-instantly on my MacBook, and it makes other software feel sluggish in comparison. But this is the exception, not the rule.

Even as specs regress, I don't think most people in software will care about performance. In my experience, product managers never act on the occasional "[X part of an app] feels clunky" feedback from clients. I don't expect that to change in the near future.

Workaccount2 6 hours ago | parent [-]

Software has an unfortunate attribute (compared to hardware) where it's largely bound by what users will tolerate as opposed to what practically is possible.

Imagine Ford, upon the invention of push-button climate controls, just layered those buttons on top of the legacy sliders, using arms and actuators so pressing "Heat Up" moved an actuating arm that moved that underlying legacy "Heat" slider up. Then when touch screens came about, they just put a tablet over those buttons (which are already over the sliders), so selecting "Heat Up" fired a solenoid that pressed the "Heat Up" button that moved the arm to slide the "Heat Up" slider.

Ford, or anyone else doing hardware, would never implement this or it's analog, for a long obvious list of reasons.

But in software? That's just Thursday. Hence software has seemed stuck in time for 30 years while processing speed has done 10,000x. No need to redesign the whole system, just type out a few lines of "actuating arm" code.

SXX 16 hours ago | parent | prev | next [-]

Having no dedicated GPU is certainly unrelated to whatever has been happening for the last two years.

It's just that in the last 5 years integrated GPUs have become good enough even for mid-tier gaming, let alone for running a browser and hardware acceleration in a few work apps.

And even before that, the majority of dedicated GPUs in relatively cheap laptops were garbage, barely better than the integrated one. Manufacturers mostly put them in there for the marketing value of having, e.g., an Nvidia dGPU.

amiga-workbench 16 hours ago | parent | next [-]

A dedicated GPU is a red flag for me in a laptop. I do not want the extra power draw or the hybrid graphics sillyness. The Radeon Vega in my ThinkPad is surprisingly capable.

vee-kay 8 hours ago | parent | next [-]

Dedicated GPUs in gaming laptops are a necessity for the IT industry, as they force manufacturers, assemblers and software makers to be more creative and ambitious with power draw, graphics software, and optimal use of the available hardware (e.g., better batteries and different performance modes to compensate for the higher power consumption of the GPU, so a low-power mode enabled by a casual user will disable the dedicated GPU and make the OS and apps rely on the integrated GPU instead, while the same or another user on the same PC can switch to the dedicated GPU when playing a game or doing VFX or modeling work).

Without dedicated GPUs, we consumers will get only weaker hardware, slower software and the slow death of the graphics software market. See the fate of the Chromebook market segment - it is almost dead, and ChromeOS itself got abandoned.

Meanwhile, the same Google which made ChromeOS as a fresh alternative to Windows, Mac and Linux is trying to gobble up the AI market. And the AI race is on.

And the result of all this AI focus and this veering away from dedicated GPUs (even by market leader Nvidia, which no longer treats GPUs as a priority) is not only skyrocketing prices for hardware components, but other side effects too. New laptops are being launched with NPUs which are good for AI but bad for gaming and VFX/CAD-CAM work, yet they cost a bomb. As a result, the budget laptop segment has suffered: new budget laptops ship with just 8GB RAM, a 250GB/500GB SSD and a poor CPU, hardware so weak that even basic software (MS Office) struggles on it. And yet even such poor laptops cost more these days. This kind of deliberate market crippling affects hundreds of millions of students and middle-class customers who need affordable yet decent-performance PCs.

iancmceachern 13 hours ago | parent | prev | next [-]

For me it's a necessity to run the software I need to do my work (CAD design)

silon42 11 hours ago | parent | prev | next [-]

Same here... I do not wish for a laptop with >65W USB-C power requirements.

trinsic2 14 hours ago | parent | prev [-]

Yeah, I agree it's not worth it to have an iGPU and a dedicated GPU, if I'm understanding what you are talking about. There are always issues with that setup in laptops. But I'd stay away from all laptops at this point, until we get an administration that enforces antitrust. All manufacturers have been cutting so many corners, you're likely to have hardware problems within a year unless it's a MacBook or a business-class laptop.

trinsic2 14 hours ago | parent | prev | next [-]

Yea mid-tier is a stretch. Maybe low-end gaming

SXX 12 hours ago | parent [-]

Low-end gaming is 2D indie titles, and they now run on toasters.

All the popular mass-market games work on an iGPU: Fortnite, Roblox, MMOs, arena shooters, battle royales. A good chunk of cross-platform console titles also work just fine.

You can play Cyberpunk or BG3 on a damn Steam Deck. I won't call this low end.

The number of games that don't run to some extent without a dGPU is limited to heavy AAA titles and niche PC-only genres.

ChoGGi 7 hours ago | parent [-]

You can play Doom: The Dark Ages on a Steam Deck. Granted, at 30 fps, but it's still Doom with ray tracing.

Fabricio20 14 hours ago | parent | prev [-]

I'm gonna be honest, that's not my experience at all. I got a laptop with a modern Ryzen 5 CPU four years ago that had an iGPU because "it's good enough for even mid-tier gaming!" and it was so bad that I couldn't play 1440p on YouTube without it skipping frames. Tried Parsec to my desktop PC and it was failing at that as well. I returned it and bought a laptop with an Nvidia dGPU (low end still, I think it was like a 1050-refresh-refresh equivalent) and haven't had any of those problems. That AMD Vega GPU just couldn't do it.

copx 11 hours ago | parent | next [-]

No Ryzen 5 system should have any trouble playing YouTube videos, there must have been something wrong with your system.

rabf 7 hours ago | parent | prev | next [-]

That's a problem with YouTube, not your GPU!

Iulioh 14 hours ago | parent | prev [-]

What processor are we talking about?

Your experience is extremely weird

999900000999 15 hours ago | parent | prev | next [-]

Or, we can expect better from software. Maybe someone can fork Firefox and make it run better, hard cap how much a browser window can use.

The pattern of lazy, almost non-existent optimization, combined with blaming consumers for having weak hardware, needs to stop.

On my 16GB RAM Lunar Lake budget laptop, CachyOS (Arch) runs so much smoother than Windows.

This is very unscientific, but according to htop, while running Chrome/YouTube playing music, 2 browser games and VS Code with GitHub Copilot reviewing a small project, I was only using 6GB of RAM.

For the most part I suspect I could do normal consumer stuff (filing paperwork and watching cat videos) on an 8GB laptop just fine. Assuming I'm using Linux.

All this Windows 11 bloat makes computers slower than they should be. A part of me hopes this pushes Microsoft to at least create a low ram mode that just runs the OS and display manager, then lets me use my computer as I see fit instead of constantly doing a million other weird things.

We don't *need* more ram. We need better software.

walterbell 15 hours ago | parent | next [-]

> hopes this pushes Microsoft to at least create a low ram mode

Windows OS and Surface (CoPilot AI-optimized) hardware have been combined in the "Windows + Devices" division.

> We don't *need* more ram

RAM and SSDs both use memory wafers and are equally affected by wafer hoarding, strategic supply reductions and market price manipulation.

Nvidia is re-inventing Optane for AI storage with higher IOPS, and paid $20B for Groq LPUs using SRAM for high memory bandwidth.

The architectural road ahead has tiers of memory, storage and high-speed networking, which could benefit AI & many other workloads. How will industry use the "peace dividend" of the AI wars? https://www.forbes.com/sites/robtoews/2020/08/30/the-peace-d...

  The rapid growth of the mobile market in the late 2000s and early 2010s led to a burst of technological progress..  core technologies like GPS, cameras, microprocessors, batteries, sensors and memory became dramatically cheaper, smaller and better-performing.. This wave of innovation has had tremendous second-order impacts on the economy. Over the past decade, these technologies have spilled over from the smartphone market to transform industries from satellites to wearables, from drones to electric vehicles.
PunchyHamster 13 hours ago | parent [-]

> RAM and SSDs both use NAND flash and are equally affected by wafer hoarding, strategic supply reductions and market price manipulation.

Why on earth do you think RAM uses NAND flash?

walterbell 13 hours ago | parent [-]

Sorry, still editing long comment, s/NAND flash/memory wafers/.

citrin_ru 6 hours ago | parent | prev | next [-]

> Maybe someone can fork Firefox and make it run better, hard cap how much a browser window can use

Browsing the web requires more and more RAM each year, but I don't think browsers are the main reason - sites use more and more JS code. With a hard cap many sites will stop working. Software bloat is a natural tendency, the path of least resistance. Trimming weight requires a significant effort, and in the case of the web, a coordinated effort. I don't believe it could happen unless Google (having a browser with >60% market share) forces it, but Google's own sites are among the worst offenders in terms of hardware requirements.

tazjin 10 hours ago | parent | prev | next [-]

> Or, we can expect better from software. Maybe someone can fork Firefox and make it run better, hard cap how much a browser window can use.

You can already do this. For example, I use `systemd-run` to run browsers with CPU quotas applied. Firefox gets 400% CPU (i.e. up to 4 cores), and no more.

Example command: systemd-run --user --scope -p CPUQuota=400% firefox
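
You can cap memory the same way, which is closer to the hard cap the parent asked about. A minimal sketch, assuming cgroup v2 and systemd's standard MemoryMax resource-control property (the 2G figure is arbitrary): systemd-run --user --scope -p MemoryMax=2G -p CPUQuota=400% firefox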

vee-kay 8 hours ago | parent [-]

You can impose CPU restrictions in Windows 10 or 11 too...

You can limit CPU usage for a program in Windows by adjusting the "Maximum processor state" in the power options to a lower percentage, such as 80%. Additionally, you can set the program's CPU affinity in Task Manager. Please note this will only affect the process scheduling.

You can also use a free tool like Process Lasso or BES to limit the CPU for a Windows application. And you can use free tools like HWInfo and Sysinternals (ProcMon, Sysmon, ProcDump) to monitor CPU usage, especially to investigate CPU spikes caused by rogue (malware or poorly performing) apps.
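
If you'd rather script it than click through Task Manager, here is a minimal cross-platform sketch using the third-party psutil package (pip install psutil); the process name and the core list are just illustrative assumptions:

  import psutil

  # Pin every running Firefox process to the first four logical CPUs (0-3),
  # limiting how much of the machine it can occupy.
  for proc in psutil.process_iter(["name"]):
      if proc.info["name"] == "firefox.exe":
          proc.cpu_affinity([0, 1, 2, 3])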

cellular 6 hours ago | parent [-]

CPU affinity? I haven't been able to change priority in Task Manager since Windows 8, I think. CPU affinity only seems to control which cores get assigned... not really good management.

vee-kay 6 hours ago | parent [-]

Process Lasso worked for me a few years back when I needed to restrict CPU cores for an old program.

viccis 15 hours ago | parent | prev | next [-]

Yeah I'm sure that will happen, just like prices will go back down when the stupid tariffs are gone.

MangoToupe 13 hours ago | parent | prev | next [-]

> I was only using 6GB of RAM.

Insane that this is seen as "better software". I could do basically the same functionality in 2000 with 512MB. I assume this is because everything runs through Chrome with dozens more layers of abstraction, but

999900000999 an hour ago | parent | next [-]

Let's not let perfect be the enemy of good.

I was being lazy, but optimized I guess I could get down to 4GB of ram.

kasabali 12 hours ago | parent | prev [-]

More like 128MB.

512MB in 2000 was like HEDT level (though I'm not sure that acronym existed back then)

anthk 11 hours ago | parent | next [-]

512MB weren't that odd for multimedia from 2002, barely a few years later. By 2002 256MB of RAM were the standard, almost a new low-end PC.

64MB = w98se OK, XP will swap a lot on high load, nixlikes really fast with fvwm/wmaker and the like. KDE3 needs 128MB to run well, so get a bit down. No issues with old XFCE releases. Mozilla will crawl, other browsers will run fine.

128MB = w98se really well, XP will run fine, SP2-3 will lag. Nixlikes will fly with wmaker/icewm/fvwm/blackbox and the like. Good enough for Mozilla.

192MB = Really decent for a full KDE3 desktop or for Windows XP with real life speeds.

256MB = Like having 8GB today for Windows 10, Gnome 3 or Plasma 6. Yes, you can run them with 2GB and ZRAM, but realistically, with the modern bloated tools, 8GB for a 1080p desktop is mandatory, even with uBlock Origin in the browser. Ditto back in the day. With 256MB, XP and KDE3 flew and ran much faster than even Win98 with 192MB of RAM.

vee-kay 4 hours ago | parent [-]

Win10 can work with 8GB DDR4 RAM.

Win11, on the other hand, meh..

Though Win10 will stop getting updates, M$ is mistaken if it thinks it can force customers to switch to the more expensive, buggy, worse-performing Win11.

That's why I switched my old PC (a cute little Sony Vaio) to Linux, though it worked well with Win10. Especially after I upgraded it with a 1TB SATA SSD (even an old SATA 1.0 socket works with newer SATA SSDs, since the SATA interface is backward compatible; it felt awesome to see a new SSD work perfectly in a 15-year-old laptop), some additional RAM (24GB (8+16), with the 16GB repurposed from another PC), and a new battery (from Amazon - it was simply plug and play: eject the old battery from its slot and plug in the new one).

I find it refreshing how easy it was to upgrade an old PC. I think manufacturers are deliberately making it harder to repair devices, especially mobile phones. That's why the EU and India were forced to mandate the Right to Repair.

MangoToupe 7 hours ago | parent | prev [-]

You're right. I thought I was misremembering....

dottjt 15 hours ago | parent | prev [-]

That's not happening though, hence why we need more ram.

ssl-3 14 hours ago | parent [-]

Eh? As I see it, we've got options.

Option A: We do a better job at optimizing software so that good performance requires less RAM than might otherwise be required

Option B: We wish that things were different, such that additional RAM were a viable option like it has been at many times in the past.

Option C: We use our time-benders to hop to a different timeline where this is all sorted more favorably (hopefully one where the Ballchinians are friendly)

---

To evaluate these in no particular order:

Option B doesn't sound very fruitful. I mean: It can be fun to wish, but magical thinking doesn't usually get very far.

Option C sounds fun, but my time-bender got roached after the last jump and the version of Costco we have here doesn't sell them. (Maybe someone else has a working one, but they seem to be pretty rare here.)

That leaves option A: Optimize the software once, and duplicate that optimized software to whomever it is useful using that "Internet" thing that the cool kids were talking about back in the 1980s.

rabf 6 hours ago | parent [-]

There is plenty of well-optimised software out there already; hopefully a RAM shortage can encourage people to seek it out. It would be nice if there were some well-curated lists of apps, sort of like suckless but perhaps a little less extreme. A long-standing problem in the software industry is developers having insanely overspecced machines and fat internet pipes, leading to performance issues going unnoticed by the people who should be fixing them. The claim that they need that power to run their code editor and compiler is really only a need for code editors and compilers that suck less. I've always run a 10-year-old machine (I'm cheap) and had the expectation that my debug builds run acceptably fast!

zahlman 18 hours ago | parent | prev | next [-]

I'm reading this thread on an 11-year-old desktop with 8GB of RAM and not feeling any particular reason to upgrade, although I've priced it out a few times just to see.

Mint 22.x doesn't appear to be demanding any more of my machine than Mint 20.x. Neither is Firefox or most websites, although YouTube chat still leaks memory horrendously. (Of course, download sizes have increased.)

Groxx 16 hours ago | parent | next [-]

I've been enjoying running Mint on my terrible spec chromebook - it only has 3GB of RAM, but it rarely exceeds 2GB used with random additions and heavy firefox use. The battery life is obscenely good too, I easily break 20 hours on it as long as I'm not doing something obviously taxing.

Modern software is fine for the most part. People look at browsers using tens of gigabytes on systems with 32GB+ and complain about waste rather than being thrilled that it's doing a fantastic job caching stuff to run quickly.

callc 17 hours ago | parent | prev | next [-]

Mint is probably around 0.05% of desktop/laptop users.

I think root comment is looking at the overall picture of what all customers can get for their money, and see it getting worse.

This wasn’t mentioned, but it’s a new thing for everyone to experience, since the general trend of computer hardware is it gets cheaper and more powerful over time. Maybe not exponentially any more, but at least linearly cheaper and more powerful.

OGEnthusiast 15 hours ago | parent [-]

> I think root comment is looking at the overall picture of what all customers can get for their money, and see it getting worse.

A $999 MacBook Air today is vastly better than the same $999 MacBook Air 5 years ago (and even more so once you count inflation).

chii 15 hours ago | parent [-]

The OP is not looking at the static point (the price of that item), but at the trend, i.e. the derivative of price vs quality. It was on a steep upward incline, and now it's flattening.

cons0le 6 hours ago | parent | prev | next [-]

This is a text based website though. It should be fast on everything. Most websites are a bloated mess, and a lot slower

vee-kay 4 hours ago | parent | prev [-]

Try Win11 on that old PC, and you'll really feel the need for more RAM and a better CPU.

I sometimes feel M$ is deliberately making its Windows OS clunkier, so it can turn into a SaaS offering with a pricey subscription, like it has already successfully done with its MS-Office suite (Office 365 is the norm in corporates these days, though individuals have to shell out $100 per year for MS Office 365 Personal edition). We can still buy MS Office 2024 as standalone editions, but they are not cheap, because Micro$oft knows the alternatives on the market aren't good enough to be a serious threat.

chii 15 hours ago | parent | prev | next [-]

> Industry mandate should have become 16GB RAM for PCs

It was only less than 10 years ago that a high-end PC would have this level of RAM. I think the last decade of cheap RAM and increasing core counts (and clock speeds) has spoiled a lot of people.

We are just returning to trend. Maybe software will be written better now that you cannot expect the average low-budget PC to have 32G of RAM and 8 cores.

GuB-42 12 hours ago | parent | prev | next [-]

I think it is the result of specs like RAM and CPU no longer being the selling point it once was, except for gaming PCs. Instead people want thin laptops, good battery life, nice screens, premium materials, etc... We have got to the point where RAM and CPU are no longer a limiting factor for most tasks, or at least until software become bloated enough to matter again.

If you want a powerful laptop for cheap, get a gaming PC. The build quality and battery life probably won't be great, but you can't be cheap without making compromises.

Same idea for budget mobiles. A Snapdragon Gen 6 (or something by Mediatek) with UFS2.2 is more than what most people need.

mmsimanga 9 hours ago | parent | prev | next [-]

It is even worse for those of us in Africa. The equivalent phone I can buy for USD $150-250 back home is absolutely shocking in terms of how bad it is in RAM and disk space, and it often ships with an outdated version of Android. I buy my phones on Amazon US, which delivers here, and I get a much better phone in the same price range.

lysace 9 hours ago | parent [-]

In that price range, paid bundling of pre-installed apps becomes a major economic driver. Those installs are worth a lot more in the US.

mmsimanga 6 hours ago | parent [-]

Ah that makes sense. Thanks for the insight.

linguae 19 hours ago | parent | prev | next [-]

I wonder what we can do to preserve personal computing, where users, not vendors, control their computers? I’m tired of the control Microsoft, Apple, Google, OpenAI, and some other big players have over the entire industry. The software has increasingly become enshittified, and now we’re about to be priced out of hardware upgrades.

The problem is coming up with a viable business model for providing hardware and software that respect users’ ability to shape their environments as they choose. I love free, open-source software, but how do developers make a living, especially if they don’t want to be funded by Big Tech?

Saris 19 hours ago | parent [-]

Run a lightweight Linux distro on older hardware maybe?

ta9000 19 hours ago | parent | next [-]

This is it. Buy used Dell and HP hardware with 32 GB of RAM and swap the PCIe SSD for a 4 TB one.

dartharva 10 hours ago | parent [-]

No, this is not it. It only worked when there were a small number of buyers for used hardware, who were largely enthusiasts. The moment it becomes mainstream you're going to face the same scarcity in the used/refurbished market as well.

intrasight 8 hours ago | parent [-]

There are lots of such computers to be repurposed. It'll relieve price pressure and avoid e-waste.

thatguy0900 16 hours ago | parent | prev [-]

Exclusively using an ever-dwindling stock of old hardware is not really a practical solution to preserving hardware rights in the long term.

jjgreen 9 hours ago | parent [-]

The future is ragged shoeless and grimy humans fighting over the last few 1990's pocket calculators.

HexPhantom 10 hours ago | parent | prev | next [-]

I think a lot of this rings true, but what makes it especially frustrating is that none of it feels technically necessary

cons0le 6 hours ago | parent | prev | next [-]

>It is as if the industry has decided to focus on AI and nothing else.

I mean, isn't this exactly what happened? I could be wrong about this, but didn't ycombinator themselves say they weren't accepting any ideas that didn't include AI?

80 percent of the jobs people reach out to me for are shady AI jobs that I have no interest in. Hiring in other non-AI jobs seems to have slowed.

When I talk to "computer" people, they only want to talk about AI. I wish I could quit computers and never look at a screen again.

deadbabe 16 hours ago | parent | prev [-]

I think it will be a good thing actually. Engineers, no longer having the luxury of assuming that users have high end system specs, will be forced to actually write fast and efficient software. No more bloated programs eating up RAM for no reason.

rurp 16 hours ago | parent | next [-]

The problem is that higher performing devices will still exist. Those engineers will probably keep using performant devices and their managers will certainly keep buying them.

We'll probably end up in an even more bifurcated world where the well off have access to lot of great products and services that most of humanity is increasingly unable to access.

anigbrowl 15 hours ago | parent | next [-]

Have the laws of supply and demand been suspended? Capital is gonna pour into memory fabrication over the next year or two, and there will probably be a glut 2-3 years from now, followed by retrenchment and wails that innovation has come to a halt because demand has stalled.

kasabali 12 hours ago | parent | next [-]

We're not talking about growing tomatoes in your backyard.

re-thc 14 hours ago | parent | prev [-]

> Have the laws of supply and demand been suspended?

There is the law of uncertainty overriding it, e.g. trade wars, tariffs, etc.

No one is going all in on new capacity.

charcircuit 15 hours ago | parent | prev | next [-]

If "performant" devices are not widespread then telemetry will reveal that the app is performing poorly for most users. If a new festure uses more memory and sugnificantly increases the crash rate, it will be disabled.

Apps are optimized for the install base, not for the engineer's own hardware.

DeepSeaTortoise 14 hours ago | parent [-]

What is the point of telemetry if your IDE launching in under 10s is considered the pinnacle of optimization?

That's like 100B+ instructions on a single core of your average superscalar CPU.

I can't wait for maps loading times being measured in percentage of trip time.

charcircuit 12 hours ago | parent | next [-]

Because you don't want to regress any of the substeps of such a loading process and turn it back into 10+ seconds of loading.

deadbabe 9 hours ago | parent | prev [-]

If your IDE isn’t launching instantly you have a bad IDE.

DeepSeaTortoise 6 hours ago | parent [-]

I guess so, but:

https://youtu.be/qqUgl6pFx8Q?si=x3CpsW9Aane7GHHV&t=1875

hansvm 16 hours ago | parent | prev | next [-]

Can confirm, I'm currently requesting as much RAM as can fit in the chassis and permission to install an OS not too divorced from what we run in prod.

On the bright side, I'm not responsible for the UI abominations people seem to complain about WRT laptop specs.

incompatible 16 hours ago | parent | prev [-]

How many "great products and services" even need a lot of RAM, assuming that we can live without graphics-intensive games?

TheDong 15 hours ago | parent | next [-]

Some open source projects use Slack to communicate, which is a real ram hog. Github, especially for viewing large PR discussions, takes a huge amount of memory.

If someone with a low-memory laptop wants to get into coding, modern software-development-related services are incredible memory hogs.

deadbabe 9 hours ago | parent [-]

IRC is far superior to Slack when it comes to RAM usage. Projects should just switch to that.

rhdunn 13 hours ago | parent | prev [-]

Image, video, and music editing. Developing, running, and debugging large applications.

Ekaros 11 hours ago | parent [-]

The last three sound to me like self-inflicted issues. If applications weren't so large, wouldn't fewer resources be needed?

Insanity 16 hours ago | parent | prev | next [-]

I'm not optimistic that this would be the outcome. You likely will just have poor running software instead. After all, a significant part of the world is already running lower powered devices on terrible connection speeds (such as many parts of Africa).

chii 14 hours ago | parent [-]

> a significant part of the world is already running lower powered devices

but you cannot consider this in isolation.

The developed markets have vastly higher-spending consumers, which means companies cater to those higher-spending customers proportionately more (as profits demand it). Therefore, the implication is that lower-spending markets get less investment and are catered to less; after all, R&D spending is still a limiting factor.

If the entirety of the market is running on lower powered devices, then it would get catered for - because there'd be no (or not enough) customers with high powered devices to profit off.

Insanity 6 hours ago | parent [-]

In general what you are saying makes sense. But there are specific counter examples, such as Crysis in 2008 or CyberPunk 2077 some years ago.

Both didn’t run great on the “average consumer hardware”.

But I’ll admit this is cherry picking from my side :)

trinsic2 14 hours ago | parent | prev | next [-]

I don't think it's going to happen in this day and age. Some smart people will, but most barely know how to write their own code, let alone write efficient code.

TheDong 16 hours ago | parent | prev | next [-]

At the same time, AI has made it easier than ever to produce inefficient code, so I expect to rather see an explosion of less efficient software.

rabf 5 hours ago | parent [-]

It has also made it easier than ever to build native applications that use the OS-provided toolkits and avoid adding a complete web-tech stack to everything.

laterium 15 hours ago | parent | prev | next [-]

Why are you celebrating compute becoming more expensive? Do you actually think it will be good?

thatguy0900 16 hours ago | parent | prev | next [-]

I think the actual outcome is they will expect you to rent servers to conduct all your computing on and your phone and pc will be a dumb terminal.

jghn 15 hours ago | parent | prev [-]

Good luck with that.

rainsford 4 hours ago | parent | prev | next [-]

One aspect of this I don't see mentioned all that often is that AI is competing for computing resources that are at least somewhat limited in the short to medium term while being pretty inefficient at utilizing those resources compared to alternatives.

A medium end gaming PC can display impressively realistic graphics at high resolutions and framerates while also being useful for a variety of other computationally intensive processing tasks like video encoding, compiling large code bases, etc. Or it can be used to host deeply mediocre local LLMs.

The actual frontier models from companies like Anthropic or OpenAI require vastly more expensive computing resources, resources that could otherwise be used for potentially more useful computation that isn't so inefficient. Think of all the computing power going into frontier models but applied to weather forecasting or cancer research or whatever.

Of course it's not either or, but as this article and similar ones point out, chips and other computing resources aren't infinite and AI for now at least has a seemingly insatiable appetite and enough dollars to starve other uses.

vee-kay 3 hours ago | parent | next [-]

(Sharing a comment I recently posted on a similar thread..)

For the last 2+ years, I've noticed a worrying trend: typical budget PCs (especially laptops) are being sold at higher prices with lower RAM (just 8GB) and lower-end CPUs (and no dedicated GPUs). Industry mandate should have become 16GB RAM for PCs and 8GB for mobile years ago, but instead it is as if the computing/IT industry is regressing.

New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6, UFS2.2). Meanwhile, features that were being offered in budget phones, e.g., wireless charging, NFC, UFS3.1 have silently been moved to the premium mobile segment.

Meanwhile the OSes and software are becoming more and more complex, bloated and more unstable (bugs) and insecure (security loopholes ready for exploits).

It is as if the industry has decided to focus on AI and nothing else.

And this will be a huge setback for humanity, especially the students and scientific communities.

pureagave 3 hours ago | parent [-]

I don't think we need "industry mandate". I suspect the electronics specs are dictated by the very efficient market and the consumer is being squeezed in many ways. So device manufactures are just meeting the pricing needs of the consumer and dropping the expensive things that are less understood like GPUs and extra ram.

walterbell 3 hours ago | parent | next [-]

> dropping the expensive things that are less understood

If "old" devices outperform new devices, consumers will gain new understanding from efficient market feedback, influencing purchase decisions and demand for "new" devices.

vee-kay an hour ago | parent | prev [-]

RAMageddon is here.. https://www.tomsguide.com/news/live/ram-price-crisis-updates

Summary:

* Massive spikes: Consumer RAM prices have skyrocketed due to a tight supply. Major PC companies have issued warnings of price hikes, with CyberPowerPC stating: "global memory (RAM) prices have surged by 500% and SSD prices have risen by 100%."

* All for AI: The push for increased cloud computing, as seen in the likes of ChatGPT and Gemini, means more data centers are needed, which in turn requires High Bandwidth Memory (HBM). Manufacturers like SK Hynix and Micron are now shifting priorities to make HBM instead of PC RAM.

* Limited supply: Companies are now buying up stock of all the remaining supply of standard DRAM chips, leaving crumbs for the consumer market and price hikes for the limited supply there is.

Good luck expecting "value for money" from this "efficient" market.

BobbyJo 3 hours ago | parent | prev [-]

Define "efficient" in this context.

The point of the gold rush now is that a large number of investors think AI will be more efficient at converting GPU and RAM cycles into money than games or other applications will. Hence they are willing to pay more for the same hardware.

SunlitCat 10 hours ago | parent | prev | next [-]

I really wonder when the point will be reached at which the South Korean government steps in and starts to take a closer look at the growing long-term supply commitments that companies like OpenAI are indirectly driving with major memory manufacturers such as SK hynix and Samsung Electronics.

Allocating a very large share of advanced memory production, especially HBM and high-end DRAM, which are critical for almost all modern technology (and even many non-tech products like household appliances), to a small number of U.S.-centric AI players risks distorting the global market and limiting availability for other industries.

Even within Samsung itself, the Mobile eXperience (MX) Business (smartphones) is not guaranteed preferential access to memory from Samsung’s Device Solutions (DS) Division, which includes the Memory Business. If internal customers are forced to source DRAM elsewhere due to pricing or capacity constraints, this could eventually become economically problematic for a country that relies very heavily on semiconductor and technology exports.

HexPhantom 10 hours ago | parent | next [-]

If Samsung's own MX division can't count on predictable access because internal transfer pricing loses to hyperscaler demand, that's a red flag

nubinetwork 9 hours ago | parent | prev [-]

> South Korean government

It's not like it's their fault that micron pulled out of the market...

Edit: maybe someone should consider sweet-talking kioxia into making dram chips?

Neywiny 6 hours ago | parent [-]

Micron isn't pulling out of the market. They discontinued Crucial. Those are very different things. They pulled out of the direct-to-consumer market, not DRAM.

goku12 11 hours ago | parent | prev | next [-]

I understand the issue with all the devices. But what about the rest of the things that depend on these electronics, especially DRAM? Automotive, aircraft, marine vessels, ATC, shipping coordination, traffic signalling, rail signalling, industrial control systems, public utility (power, water, sewage, etc.) control systems, transmission grid control systems, HVAC and environment control systems, weather monitoring networks, disaster alerting and management systems, ticketing systems, e-commerce backbones, scheduling and rostering systems, network backbones, entertainment media distribution systems, defense systems, and I don't know what else. Don't they all require DRAM? What will happen to all of them?

synack 11 hours ago | parent | next [-]

Industrial microcontrollers and power electronics use older process nodes, mostly >=45nm. These customers aren’t competing for wafers from the same fabs as bleeding edge memory and TPUs.

The world ran just fine on DDR3 for a long time.

goku12 11 hours ago | parent | next [-]

Okay, but what about the rest? The ones that aren't embedded in someway and use industrial grade PCs/control stations? Or ones with large buffers like network routers? I'm also wondering about the supply of the alternate nodes and older technologies. Will the manufacturers keep those lines running? Was it Micron that abandoned the entire retail market in favor of supplying the hyperscalers?

Imustaskforhelp 10 hours ago | parent [-]

> The ones that aren't embedded in someway and use industrial grade PCs/control stations? Or ones with large buffers like network routers?

Not sure if they require DDR5, but the AI crisis just caused DDR5 prices to rise, and demand then spilled over into DDR4, which is why it got more expensive too.

> I'm also wondering about the supply of the alternate nodes and older technologies.

I suppose these might be Chinese companies, though there may be some European/American ones too (not sure). But if things continue, there is going to be more strain on demand for them, and they might increase their prices too.

> Was it micron that abandoned the entire retail market in favor of supplying the hyperscalers?

Yes

dartharva 10 hours ago | parent | prev [-]

...DDR3 that's no longer being produced. Why do people just assume old tech is abundant in supply?

lysace 11 hours ago | parent | prev [-]

A $100k EV has roughly the same amount of DRAM as a $1k phone.

The EV is therefore, on the whole, a lot less sensitive to DRAM price increases.

tirant 10 hours ago | parent | next [-]

That is factually wrong.

That might be the case only for the infotainment system, but there are usually many other ECUs in an EV. The ADAS ECUs carry similar amounts as an iPhone or the infotainment system. Telematics is usually also a relatively complex one, though with lower memory amounts.

Then you have around 3-5 other midsized ECUs with relatively high memory sizes, or at least enough to require MMUs and to run more complex operating systems supporting typical AUTOSAR stacks.

And then you have all the small size ECUs controlling all small individual actuators.

But also all complex sensors like radars, cameras, lidars carry some amounts of relevant memory.

I still think your point is valid, though. There's no difference in orders of magnitude when it comes to expensive RAM compared to an iPhone. But cars also carry lots of low-speed, automotive-grade memory in all the ECUs distributed throughout the vehicle.

lysace 9 hours ago | parent [-]

So how many GB in total for a Tesla, or say a VW EV? Something like 16-32 GB? Is that not roughly like a $1k phone?

tirant 4 hours ago | parent | next [-]

I cannot say, depends on the vehicle. But easily 50-64GB range.

If there are three main ECUs at 16GB each, you're already near 50GB. Add 2-4GB for mid-size ECUs, and anything between a few KB and some MB for the small ECUs.

danaris 7 hours ago | parent | prev [-]

For reference, the iPhone 17 Pro (which starts at $1100) has 12GB of RAM.

goku12 11 hours ago | parent | prev [-]

Okay, accepted. But are you sure that supply won't be a problem as well? I mean, even if these products use different process nodes than the hyperscalers, will the DRAM manufacturers even keep those nodes running for these industries?

Imustaskforhelp 10 hours ago | parent [-]

What will probably happen is that the resale/second-hand market for these parts will grow.

> will the DRAM manufactures even keep those nodes running in favor of these industries?

Some will, some might not. In my opinion, the longevity of these brands will depend on whether they keep selling RAM to average consumers and consumer brands, so we might see new competition, or more market share going to fab companies beyond the main three.

I am sure some company will align with consumers, but the problem, to me, is that they wouldn't be able to produce enough supply for consumers in the first place, so prices might still rise.

And those prices will most likely be paid by you in one form or another. But it will be interesting to see how long the companies that buy DRAM from these suppliers, build datacenters, or do anything RAM-intensive can hold their prices; perhaps they'll eat the loss short term, similar to what we saw some companies do during the Trump tariffs.

dizlexic 5 hours ago | parent | prev | next [-]

Alarmist silliness. If it's a bubble, prices will drop. If it isn't, production will adapt.

polski-g 4 hours ago | parent [-]

People love to panic over something that will be fixed in 24 months.

zdc1 12 hours ago | parent | prev | next [-]

Thankfully we're at a stage where a 4 year old second-hand iPhone is perfectly usable, as are any M-series Macs or most Linux laptops. Sucks for anyone needing something particularly beefy for work; but I feel that a lot of purchases can be delayed for at least a year or two while this plays out.

Imustaskforhelp 10 hours ago | parent [-]

> Sucks for anyone needing something particularly beefy for work; but I feel that a lot of purchases can be delayed for at least a year or two while this plays out.

100%. There is a lost-time factor, which sucks, and if someone wants something beefy they might be forced to cough up the money. But yes, I agree with your statement: the RAM increases really hit the self-hosting/homelabbing crowd. There was a recent Hacker News post relevant to this discussion: https://news.ycombinator.com/item?id=46416618 ("Self hosting is being enshittified").

eb0la 8 hours ago | parent | prev | next [-]

I believe we're in the middle of a "perfect storm" and AI is not the only one to blame:

- Right now A LOT of PCs are getting out of date because Windows 11 wants new hardware that's missing in older PCs.
- At the same time, smartphones were starting to use and need more memory.
- Modern cars (mostly electric) have much more electronics inside than older ones... they're now distributed systems with several CPUs working together along a bus.
- The cloud needs to upgrade, and newer servers have much more memory than older ones with almost the same footprint (which means you need to lease less datacenter space).
- And GPUs for AI are in demand and need RAM.

But only AI gets the blame, even though we're living in a perfect storm for RAM demand.

liveoneggs 6 hours ago | parent [-]

bloat and crap software coming home to roost

jazzyjackson 20 hours ago | parent | prev | next [-]

Question: are SoCs with on-die memory affected by this?

Looks like the frame.work desktop with Ryzen 128GB is shipping now at the same price it was at release, and Apple is offering 512GB Mac Studios.

Are snapdragon chips the same way?

nrp 19 hours ago | parent | next [-]

We’ve been able to hold the same price we had at launch because we had buffered enough component inventory before prices reached their latest highs. We will need to increase pricing to cover supplier cost increases though, as we recently did on DDR5 modules.

Note that the memory is on the board for Ryzen AI Max, not on the package (as it is for Intel’s Lunar Lake and Apple’s M-series processors) or on die (which would be SRAM). As noted in another comment, whether the memory is on the board, on a module, or on the processor package, they are all still coming from the same extremely constrained three memory die suppliers, so costs are going up for all of them.

mips_avatar 15 hours ago | parent [-]

How do suppliers communicate these changes? Are they just like, yep, now it's 3x higher? I'm surprised you don't have longer contracts.

appellations 15 hours ago | parent | next [-]

Longer contracts are riskier. The benefit of having cheaper RAM when prices spike is not strong enough to outweigh the downside of paying too much for RAM when prices drop or stay the same. If you’re paying a perpetual premium on the spot price to hedge, then your competitors will have pricing power over you and will slowly drive you out of the market. The payoff when the market turns in your favor just won’t be big enough and you might not survive as a business long enough to see it. There’s also counterparty risk, if you hit a big enough jackpot your upside is capped by what would make the supplier insolvent.

All your competitors are in the same boat, so consumers won't have options. It's much better to minimize the risk of blowing up by sticking as closely to spot as possible. That's the whole idea of lean. Consumers and governments were mad about supply chains during the pandemic, but companies survived because they were lean.

In a sense this is the opposite risk profile of futures contracts in trading/portfolio management, even though they share some superficial similarities. Manufacturing businesses are fundamentally different from trading.

They certainly have contracts in place that cover goods already sold. They do a ton of preorders which is great since they get paid before they have to pay their suppliers. Just like airlines trade energy futures because they’ve sold the tickets long before they have to buy the jet fuel.
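A toy illustration of that asymmetry, with entirely made-up numbers (a hypothetical fixed long-term contract price versus a spot price that spikes for two quarters):

    # Toy numbers only: hypothetical $/unit spot prices over six quarters,
    # versus a hypothetical fixed long-term contract price.
    spot_prices = [100, 110, 300, 250, 120, 90]
    fixed_price = 130
    units_per_quarter = 1_000

    spot_cost = sum(p * units_per_quarter for p in spot_prices)      # 970,000
    fixed_cost = fixed_price * units_per_quarter * len(spot_prices)  # 780,000

    # The fixed contract only pays off if a spike like this actually happens;
    # in the quarters without a spike you pay a ~20-30% premium that
    # spot-buying competitors don't.
    print(spot_cost, fixed_cost)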

baq 11 hours ago | parent | prev | next [-]

If you're Apple, maybe that works. But in this case we're seeing 400% increases in price; instead of your RAM you'll be delivered a note to pay up, or you'll get your money back with interest and termination fees, and the supplier is still net positive.

chii 14 hours ago | parent | prev [-]

> longer contracts

the risk is that such longer contracts would then lock you into a higher cost component for longer, if the price drops. Longer contracts only look good in hindsight if ram prices increased (unexpectedly).

addaon 20 hours ago | parent | prev | next [-]

> Question: are SoCs with on-die memory affected by this?

SoCs with on-die memory (which is, these days, exclusively SRAM, since I don't think IBM's eDRAM process for mixing DRAM with logic is still in production) will not be effected. SiPs with on-package DRAM, including Apple's A and M series SiPs and Qualcomm's Snapdragon, will be effected -- they use the same DRAM dice as everyone else.

pixelpoet 20 hours ago | parent | next [-]

The aforementioned Ryzen AI chip is exactly what you describe, with 128 GB on-package LPDDR5X. I have two of them.

To answer the original question: the Framework Desktop is indeed still at the (pretty inflated) price, but for example the Bosgame mini PC with the same chip has gone up in price.

plagiarist 16 hours ago | parent [-]

Are your two chips in Framework Desktops, or some other package? I'm interested in a unified memory setup and curious about the options.

zahlman 18 hours ago | parent | prev | next [-]

https://en.wiktionary.org/wiki/die#Noun

"dice" is the plural for the object used as a source of randomness, but "dies" is the plural for other noun uses of "die".

layer8 11 hours ago | parent | prev [-]

*affected

piskov 20 hours ago | parent | prev | next [-]

Apple secured at least a year's worth of memory supply (not in actual chips, but in prices).

The bigger the company, the longer the contract.

However, it will eventually catch up even to Apple.

It is not prices alone rising due to demand, but also manufacturing being redirected from things like LPDDR for iPhones to HBM and the like for servers and GPUs.

mirsadm 12 hours ago | parent | next [-]

Apple charges so much for RAM upgrades that they could probably not even increase prices and still be fine. They won't but they probably could.

layer8 11 hours ago | parent [-]

At the cost of reduced margins, which shareholders may not like.

greesil 15 hours ago | parent | prev | next [-]

Apparently Google fucked up

https://www.google.com/amp/s/www.indiatoday.in/amp/technolog...

magicalhippo 13 hours ago | parent [-]

Non-AMP link:

https://www.indiatoday.in/technology/news/story/ram-shortage...

SunlitCat 11 hours ago | parent [-]

To be honest, it starts to look more and more like a single company (we all know which one) is just buying up all DRAM capacity to keep others out of the (AI) game.

greesil 7 hours ago | parent [-]

Diabolical

trollbridge 19 hours ago | parent | prev [-]

I have a feeling every single supplier of DRAM is going to be far more interested in long-term contracts with Apple than with (for example) OpenAI, since there's basically zero possibility Apple goes kaput and reneges on their contracts to buy RAM.

saagarjha 19 hours ago | parent [-]

Yes, but OpenAI wants $200 billion in RAM and Apple wants $10.

dehrmann 16 hours ago | parent | prev [-]

I would think so, because fab capacity is constrained, and if you make an SoC with less on-die memory, it uses fewer transistors, so you can fit more on a wafer.

hvb2 11 hours ago | parent [-]

But bigger chips mean lower yields because there's just more room for errors?
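Both effects point the same way, at least in a back-of-the-envelope sketch using a simple Poisson yield model (the defect density and die sizes below are illustrative, not real figures):

    import math

    WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2  # 300 mm wafer, ignoring edge loss
    DEFECTS_PER_MM2 = 0.001                    # illustrative defect density

    def good_dies_per_wafer(die_area_mm2: float) -> float:
        """Good dies per wafer under a simple Poisson yield model."""
        yield_fraction = math.exp(-DEFECTS_PER_MM2 * die_area_mm2)
        return (WAFER_AREA_MM2 / die_area_mm2) * yield_fraction

    # A die with less on-die memory is smaller, so you get more candidate
    # dies per wafer *and* a higher yield on each one.
    print(good_dies_per_wafer(100))  # ~640 good dies
    print(good_dies_per_wafer(200))  # ~290 good dies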

racl101 an hour ago | parent | prev | next [-]

I should probably buy a spare Logitech keyboard and mouse pair. My current ones are showing wear and tear.

nospice an hour ago | parent [-]

FWIW, these markets are largely separate. Keyboards and mice typically use relatively simple microcontrollers made at "yesteryear" node sizes (100 nm+), so they can't start making high-density DRAM or GPUs even if they wanted to.

motbus3 10 hours ago | parent | prev | next [-]

For me, there is a concerning flag in all of this.

I know this is not always true, but in this case the Crucial folks say the margins for end users are too low and they have demand from AI.

I suppose they do not intend to spin up a new AI-focused unit because it is not worth it, or because they believe the hype might be gone before they are done. But what intrigues me is why they would allow other competitors to step up in a segment they dominate. They could raise prices for consumers if they are not worried about competition...

There is a whole "not-exactly-AI" industry labeled as AI that received a capital-T Ton of money. Is that what they are going for?

Imustaskforhelp 10 hours ago | parent | next [-]

So my understanding of the situation is that the Crucial folks had downsized their factory production (see my other comments for reference, perhaps), but then AI started demanding chips just when their production was at its lowest.

These AI companies have tons of money to burn, so they are willing to pay a lot more. Crucial now only has a limited supply of RAM, and the thing is, there isn't much difference between an AI chip and a consumer chip, but the margins on AI chips are far higher than on consumer chips.

So earlier they would sell both consumer chips and AI chips, but the AI companies kept demanding even more, and the profits from selling to them were so insane that (at least in Crucial's case) they stopped selling consumer chips just to sell AI chips.

> I suppose they do not intend to bring a new AI focused unit because it is not worth it or they believe the hype might be gone before it they are done. But what intrigues me is why they would allow other competitors to step up in a segment they dominate? They could raise the prices for the consumers if they are not worried about competition...

Well, as a consumer I certainly hope so, but I think these companies did this because there have been times their stock prices were down 55%; it's just a cash-making device at this point, and there is a near-monopoly of fabs with just three key players.

So the answer to your question is "money" and "more money" short term. Their stock prices are already up, I think, and a company really loves a short-term rising stock price.

> They could raise the prices for the consumers if they are not worried about competition

Well, would you increase the prices 3-4x? Because supposedly that's how much the AI chips go for, from what I've heard... And because of this, the second-hand market is selling these at a close-enough mark.

I don't know, but I hope new players enter the market. I didn't realize the RAM industry was so monopolistic, with only three key players, and how that has become a chokehold on the whole world economy in a way.

motbus3 7 hours ago | parent [-]

> Well, would you increase the prices 3-4x? (The text below is quite long, so let me say it here: you made great points in your answer!)

It seems they could. They single-handedly caused prices to double or more without even trying :/ Not sure if it would trigger other sorts of regulatory issues, though.

It is my impression that there is fabricated scarcity across all kinds of goods. That's a common practice among clothing retailers. In the 90s they fought for brand name and market share; then they noticed that was silly, because they could sell half the volume for double the price, and since that means less logistics, it also meant higher margins. It is not a free-lunch approach, though. Selling less means you hand at least the bottom portion of your customers over to the market, and if there are alternatives, they might just be gone. That's exactly what happened with Chevrolet, Ford, etc.: they stopped investing, and when a new competitor appeared, even if it was more marketing than product, they lost rivers of money and can barely keep up the fight (except for maybe paying a puppet to tell people there is no such thing as climate change, but that's something else).

The technology space right now looks like that. We already see major brands stagnating, allegedly because they have done all that is possible and it will take some time for some novel approach to appear.

As a consumer, I want to believe this won't take long to settle but I'm afraid money is going elsewhere

HexPhantom 10 hours ago | parent | prev [-]

Building a dedicated AI-focused consumer line is risky: long development cycles, uncertain demand, and the chance that today's hype cools before the product ships

Ekaros 13 hours ago | parent | prev | next [-]

Outside of, say, video and image editing and maybe lossless audio, why is this much RAM even needed in most use cases? And I mean actually thinking about what uses it. Computer code, unless you are actually building the whole Linux kernel, is just text, so a lot of projects would probably fit in cache. Maybe software companies should be billed for users' resources too...

vee-kay an hour ago | parent | next [-]

You might be surprised to learn that most personal/SOHO PC users use Windows as the default OS.

In fact, Microsoft and Intel built a cutthroat hold on the PC market through their long-term WinTel nexus (MS Windows optimized to run better on Intel CPUs, Intel-based PCs sold with MS Windows by default), until AMD upped the ante and stole the race by being first to release an x86-64 processor, so Microsoft chose to ditch Intel for AMD to usher in the new era of 64-bit Windows OSes.

AMD still dominates in the server market and the GPU market (where it has been innovating harder and giving better value for money than Nvidia and Intel), but is still struggling to dominate the PC market (PC assemblers/stores get more lucrative deals from Intel to sell Intel-based PCs, which is why we find fewer AMD-based PCs for sale in shops/stores).

And that doesn't bode well for PC users/customers. Because that WinTel+nVidia nexus will choose MS Windows over Linux any day.

As for why more RAM is needed, you might again be surprised to know that most people play video games on PCs and mobiles rather than on expensive consoles.

But even casual gaming needs adequate RAM and some VRAM. Even heavy-duty office work (e.g., opening/editing big Excel files or complex PDFs) is a problem on a low-end PC. Engineering students and workers need to do complex CAD/CAM work on their PCs. Artists (including musicians) need powerful software tools to do design and art work. All these needs mandate more RAM (16GB at the minimum), because most of these tools need MS Windows (or alternatively, expensive Macs, assuming macOS has alternative apps to suit such needs).

After failing to beat AMD's versatility and value-for-money performance in the CPU & GPU market, Nvidia and Intel have instead pivoted to AI to regain their stranglehold on the market. Their AI NPUs are dominating the PC market this year, but those new PCs are bad for the kinds of specific needs listed above.

This is also why Microsoft and its allies ensured that most video games are not ported to Linux (and Mac), until Valve finally started to change that status quo by focusing on Linux gaming (but out of self interest, as its money-maker Steam store became too heavily dependent on Microsoft for gaming).

So yeah, more RAM and better CPUs & GPUs please!

TrackerFF 8 hours ago | parent | prev | next [-]

Electron apps hog memory. The vast, vast majority of computer users are Windows users. Using 8 GB of memory without really "using" it is trivial. Chrome plus some Microsoft Office apps will consume that much.

system2 11 hours ago | parent | prev [-]

I have multiple apps using 300 GB+ PostgreSQL databases. For some queries, high RAM is required. I enable symmetrical NVMe swaps, too. Average Joe with gaming needs wouldn't need more than 64 GB for a long time. But for the database, as the data grows, RAM requirements also grow. I doubt my situation is relatable to many.

Ekaros 11 hours ago | parent [-]

I understand servers. But why does the average user actually need more than 2 or 4GB? For what actual data in memory at one time?

parrellel 6 hours ago | parent [-]

Where have you seen 4 GB cut it in the last decade? 2 GB was enough to make Vista chug in 2007?

I've got old linux boxes that feel fine with a couple gig of DDR3, but can't think of a place where that would be acceptable outside of that.

Ekaros 6 hours ago | parent [-]

My entire question is: why can't whatever users do on computers actually work in 2GB of RAM? What is the true reason we are in a state where that is for some reason not possible?

2 GB is a huge amount of information. So surely it should be enough for almost all normal users, but for some reason it is not.

vee-kay an hour ago | parent | next [-]

Quick: list your favorite software and tell us how many GB of disk space it uses after installation and how many GB of RAM it consumes when running.

You will find most of your fave programs struggle badly with 2-4GB of RAM, even on Linux.

Over the years most software programs (even on mobile) have become bloated and slow due to "new features" (even if most people don't need them) and also because it is a nexus with the hardware manufacturers. Who will buy any expensive CPU, more RAM, larger capacity SSDs, bigger displays, etc., if there is no software program needing all that extra oomph of performance, bandwidth, and fidelity?

bloppe 4 hours ago | parent | prev | next [-]

One potential reason: now that CPU clock speed is plateauing, parallelism is the main way to juice performance. Many apps try to take advantage of it by running N processes for N cores. For instance, my 22-core machine will use all 22 cores in parallel by default for builds with modern build systems. That's compiling ~22 files at once, using ~5x as much RAM as the 4-core machines of 15 years ago, all else being equal. As parallelism increases further, expect your builds to use even more memory.
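A rough sketch of capping parallelism by available RAM rather than core count (the 2 GB peak per compile job is an assumed, illustrative figure):

    import os

    PER_JOB_GB = 2  # assumed peak RAM per parallel compile job (illustrative)

    # Total physical RAM in GiB, via sysconf (Linux/macOS).
    total_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 2**30

    # Take whichever is smaller: one job per core, or one job per PER_JOB_GB of RAM.
    jobs = max(1, min(os.cpu_count() or 1, int(total_gb // PER_JOB_GB)))
    print(f"make -j{jobs}  # instead of -j{os.cpu_count()}")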

parrellel 5 hours ago | parent | prev [-]

Ah! Yes, I agree.

elthor89 13 hours ago | parent | prev | next [-]

If all manufacturers jump into serving the AI market segment, can this not be an opportunity for new entrants to start serving the other market segments?

How hard is it to start up and manufacture memory for embedded systems in cars, or for PCs?

lexicality 12 hours ago | parent | next [-]

If it was easy there would be more memory manufacturers, rather than 2-3 wholesalers who sell to the people who put badges & rgb on it

Imustaskforhelp 10 hours ago | parent | prev | next [-]

One might think so, but the AI companies actually lose a lot, and I mean a lot, of money in these deals.

Even if they sell their inference and everything, they still wouldn't be all that profitable.

So the key point is that a RAM company can supply OpenAI with RAM and make a quick, outsized return, even more than if they were to build their own datacenters, run open-source models in them, and provide inference on, say, OpenRouter.

Now you might ask: Is openAI or these AI companies mad for burning so much money?

And I think you might know the answer to that.

vee-kay an hour ago | parent | prev | next [-]

Good luck fabricating new microchips. It is a very expensive and difficult proposition.

nice_byte 12 hours ago | parent | prev [-]

No, because if you have the capacity to make e.g. RAM chips, it makes more economic sense to sell them for AI bucks. Serving the other market segment is an opportunity cost unless you're selling to them at the same prices. In the long run, though, if enough players emerge the price will eventually come down just due to oversupply.

SunlitCat 11 hours ago | parent [-]

Not quite. Making specialized DRAM chips for AI hardware requires high-tech components. Making low(er)-end DRAM chips for consumer needs might be easier to get started with.

I am pretty sure that in the next year we will see a wave of low-end RAM components coming out of China.

Imustaskforhelp 10 hours ago | parent | next [-]

Yea I think the same too. China is notorious for price dumping but this might be good for the end consumer.

baobabKoodaa 10 hours ago | parent | prev [-]

Best of luck with your folksy mom & pop DRAM factory.

compounding_it 16 hours ago | parent | prev | next [-]

Software has gotten bad over the last decade. Electron apps were the start but these days everything seems to be so bloated, right from the operating systems to browsers.

There was a time when Apple was hesitant to add more RAM to its iPhones, and app developers had to work hard to make apps efficient. The last few years have shown Apple going from 6GB to 12GB so easily for their 'AI', while I consistently see the quality of apps on the App Store deteriorating. iOS 26 and macOS 26 are so aggressive about memory swapping that loading Settings can take time on devices with 6GB of RAM (absurd). I wonder what else they have added that apps need purging so frequently. A 6GB iPhone and an 8GB M1 felt incredibly fast for the first couple of years. Now apparently they are slow, as if they were really old.

Windows 11 and Chrome are a completely different story. Windows 10 ran just fine on my 8th gen pc for years. Windows 11 is very slow and chrome is a bad experience. Firefox doesn't make it better.

I also find that GNOME and COSMIC DE are not exactly great with memory. A bare-minimum desktop still takes up 1.5-1.6GB of RAM on a 1080p display, and with some tabs open, a terminal, and VS Code (again Electron) I easily hit 8GB. Sway is better in this regard. I find Alacritty, Sway, and Firefox together make for a good experience.

I wonder where we are heading on personal computer software. The processors have gotten really fast and storage and memory even more so, but the software still feels slow and glitchy. If this is the industry's idea of justifying new hardware each year we are probably investing in the wrong people.

heavyset_go 15 hours ago | parent | next [-]

Firefox is set to allocate memory until a certain absolute limit or memory pressure is reached. It will eat memory whether you have 4GB of RAM or 40GB.

Set this to something you find reasonable: `browser.low_commit_space_threshold_percent`

And make sure tab unloading is enabled.

Also, you can achieve the same thing with cgroups by giving Firefox a slice of memory it can grow into.
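A minimal sketch of the cgroup route on a systemd-based Linux desktop (cgroup v2 assumed; the 4G cap is an arbitrary example value, not a recommendation), shelling out from Python to put Firefox in a memory-limited transient scope:

    import subprocess

    # Launch Firefox inside a transient systemd user scope; the kernel's
    # cgroup memory controller then enforces the cap.
    subprocess.run([
        "systemd-run", "--user", "--scope",
        "-p", "MemoryMax=4G",
        "firefox",
    ])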

anigbrowl 15 hours ago | parent | prev | next [-]

Fair. I installed a MIDI composition app recently that was 1.2 GB! Now, it does have some internal synthesis that uses samples, but only a limited selection of sounds, so I think 95% of the bulk is from Electron.

rabf 5 hours ago | parent [-]

I used to compose MIDI in Cubase on my Atari ST with 512KB of RAM. It probably had far more features and ran snappier to boot!

askonomm 7 hours ago | parent | prev [-]

I for one don't remember software ever being very good. Windows XP crashed all the time and needed frequent reformats due to all the viruses (and all the antivirus software) that made it unusable. Win 2k was even worse. Old macOS versions also crashed frequently. Old Linux distros had barely any hardware support. In fact, while I agree that software is more bloated than ever, it also seems more stable than ever, thinking back to what came before.

loudandskittish 13 hours ago | parent | prev | next [-]

Love all the variations of "8GB of RAM should be enough for anybody" in here.

walterbell 13 hours ago | parent [-]

AI PacMan eats memory, then promises to eat/write software so we need less memory.

emsign 4 hours ago | parent | prev | next [-]

Yeah, economic bubbles suck the life out of everything else. Imagine what tech doesn't get funding now because of this bet on LLMs.

ugh123 an hour ago | parent | prev | next [-]

You either like creating things or you like thinking about things.

jennyholzer3 3 hours ago | parent | prev | next [-]

The important question is: Who is planning to buy the datacenters and their hardware when these ludicrously overvalued AI companies are forced to shut doors?

memoriuaysj 21 hours ago | parent | prev | next [-]

the first stages of the world being turned into computronium.

next stage is paving everything with solar panels.

kylehotchkiss 16 hours ago | parent [-]

Solar freaking roadways reborn!

worik 30 minutes ago | parent | prev | next [-]

    AI companies are spending billions of dollars constructing data centers at warp speed around the world. It's the reason why Gogia says the demand for these chips isn't just a cyclical blip.
That makes no sense, except in the very short term.

The Datacentre building going on is clearly cyclic, the start of a cycle, but still a cycle. There are finite requirements, finite money and finite tolerance for loss.

RAM lead times to ramp up production are long, but also finite.

This will correct, again. Hopefully in the meantime we learn to do more with less, always an innovation engine

sega_sai 3 hours ago | parent | prev | next [-]

So that makes it clearer how all these AI data centers will be paid for. They will be paid for by all of us paying more for PCs, laptops and phones, while all the AI people arrange sweet deals guaranteeing themselves low prices.

l9o 13 hours ago | parent | prev | next [-]

It feels like a weird tension: we worry about AI alignment but also want everyone to have unrestricted local AI hardware. Local compute means no guardrails, fine-tune for whatever you want.

Maybe the market pricing people out is accidentally doing what regulation couldn't? Concentrating AI where there's at least some oversight and accountability. Not sure if that's good or bad to be honest.

walterbell 13 hours ago | parent | next [-]

> market pricing people out

For now. Chinese supply chains include DRAM from CXMT (sanctioned) and NAND from YMTC (not sanctioned, holds patents on 3D stacking that have been licensed by Korean memory manufacturers).

kasabali 3 hours ago | parent [-]

They may not be sanctioned from selling, but they are sanctioned in the sense that they can't buy modern fab machinery.

ls612 5 hours ago | parent | prev [-]

The people who worry about “alignment” are very much not the same people who want anyone to have local AI hardware. They are the people who would force every computer legally allowed to be sold to the hoi polloi to be as locked down as an iPhone if they could.

abhi555shek 8 hours ago | parent | prev | next [-]

This is where government can jump in, take advantage of the shortfall in supply by opening large semiconductor factories quickly, and help people avoid this bottleneck.

wmf 3 hours ago | parent [-]

Opening large semiconductor factories quickly isn't possible. Governments also can't stomach risking ~$20B.

krick 3 hours ago | parent | prev | next [-]

Reminds me of my comment of 6 years ago: https://news.ycombinator.com/item?id=21581390

bfrog 6 hours ago | parent | prev | next [-]

AI has had a net negative effect on my life and created negative value for me and my family. It's raising my power rates. It's stealing my ideas and creative output. All to enrich a select few assholes who didn't need any more enrichment.

AI embodies everything wrong with our modern gilded age era of capitalism.

xbmcuser 14 hours ago | parent | prev | next [-]

Might not be the best thing for the US, but the rest of the world needs China to reach parity with TSMC on node size to crash the market.

alecco 10 hours ago | parent | prev | next [-]

For a while CPUs kept getting significantly faster every year. Then ram kept getting bigger. Then NVMe came along and brought 200-1000x IOPS.

So writing optimized software was niche, overshadowed by the huge gains from hardware. People and corporations didn't care; they preferred fast feature delivery. Even when we used optimizing techniques like multi-tenant servers, we ended up with heavy containers wasting RAM and resources. And most apps switched to web frameworks like Electron, where each one ships its own huge web browser. Most people didn't care.

I hope this shortage has a silver lining of untangling the mess and refactoring the sea of bloat. But I know it's more likely some trick will be found to just patch it up a bit and only reduce the pain up to the new equilibrium level. For example something idiotic like Electron sharing resources across multiple apps and taking huge security risks. Corporations love to play false dichotomies to save pennies.

eb0la 8 hours ago | parent [-]

It is true that (most) people don't care, but... if you're being charged by RAM (CPU is essentially free), AND you develop an application that's also your source of income, THEN memory usage matters.

At least the first years.

ChoGGi 7 hours ago | parent | prev | next [-]

May rise? RAM has already gone through the roof, why wouldn't everything else?

HexPhantom 10 hours ago | parent | prev | next [-]

What's a bit unsettling is that this isn't a short spike driven by hype; it's a structural shift in demand

tim333 10 hours ago | parent [-]

If it's long term they can build more RAM factories. It may be a bit of a bubble though.

agilob 13 hours ago | parent | prev | next [-]

It's going to be interesting for the Google Chrome team when new laptops come equipped with 8GB of RAM by default.

johnea a day ago | parent | prev | next [-]

"May rise"?

Prices are already through the roof...

https://www.tomsguide.com/news/live/ram-price-crisis-updates

piskov 20 hours ago | parent | next [-]

Big companies secure long-term pricing (multi-year), so iPhones probably won’t feel this in 2026 (or even 2027).

2028 is another story, depending on whether this frenzy continues and on new fabs being built (I don't know whether they are as hard to build as CPU fabs).

Imustaskforhelp 21 hours ago | parent | prev [-]

Asus is ramping up production of ram...

So let's see if they might "save us"

jazzyjackson 21 hours ago | parent | next [-]

Asus doesn't operate fabs and has denied the rumor

https://www.tomshardware.com/pc-components/dram/no-asus-isnt...

Imustaskforhelp 14 hours ago | parent [-]

Hey, sorry, I didn't know that. I had watched the short-form content (https://www.youtube.com/shorts/eSnlgBlgMp8) ["Asus is going to save gaming"] and I didn't know it was a rumour.

My bad

CamperBob2 21 hours ago | parent | prev | next [-]

Asus doesn't make RAM. That's the whole problem: there are plenty of RAM retail brands, but they are all just selling products that originate from only a couple of actual fabs.

nrp 21 hours ago | parent [-]

Three major ones: Micron, Samsung, SK Hynix

And a couple of smaller ones: CXMT (if you’re not afraid of the sanctions), Nanya, and a few others with older technology

whaleofatw2022 18 hours ago | parent [-]

Is this glofo's time to shine?

bee_rider 17 hours ago | parent [-]

Do they make DRAM? I thought they made compute chips mostly.

If I recall correctly, RAM is even more niche and specialized than the (already quite specialized) general chip manufacturing. The structure is super-duper regular, just a big grid of cells, so it is super-duper optimized.

FastFT 16 hours ago | parent [-]

They (GF) do not make DRAM. They might have an eDRAM process inherited from IBM, but it would not be competitive.

You’re correct that DRAM is a very specialized process. The bit cell capacitors are a trench type that is uncommon in the general industry, so the major logic fabs would have a fairly uphill battle to become competitive (they also have no desire to enter the DRAM market in general).

shevy-java 20 hours ago | parent | prev [-]

So far all I am seeing is an increase in prices, so any company claiming it will "ramp up production" here is, in my opinion, just lying for tactical reasons.

Governments need to intervene here. This is a mafia scheme now.

I purchased about three semi-cheap computers in the last ~5 years or so. Looking at the RAM prices, the very same units I bought (!) now cost 2.5x as much as before (here I refer to my latest computer model, from 2 years ago). This is a mafia now. I also think these AI companies should be extra taxed because they cause us economic harm here.

trinsic2 13 hours ago | parent [-]

Taxing them extra is a good idea. But they bought our current administration, so we all know that's not going to happen unless something big happens, like Trump getting impeached and all the criminals in Congress going to prison. I'm wondering how likely that is. People need to get more directly involved in putting pressure on senators.

arjie 15 hours ago | parent | prev | next [-]

DRAM spot prices are something like what they were 4 years ago. Having RAM for cheap is nice. But it doesn't cost an extraordinary amount. I recently needed some RAM and was able to pick up 16x32 DDR4 for $1600. That's about twice as expensive as it used to be but $1600 is pretty cheap for 512 GiB of RAM.

A 16 GiB M4 Mac Mini is $400 right now. That covers any essential use-case which means this is mostly hitting hobbyists or niche users.

p0w3n3d 11 hours ago | parent | next [-]

> A 16 GiB M4 Mac Mini is $400 right now

Where do you live? In Poland it's 740 USD.

However, 16GB is very little for a Mac, where the OS itself uses 7GB.

llukas 10 hours ago | parent [-]

It is perfectly usable for essential use cases. This thing also has very fast swap and is good at it.

chzblck 15 hours ago | parent | prev | next [-]

you running a server or local llms with a need for 512?

arjie 15 hours ago | parent [-]

Server stuff. Nothing interesting. Supermicro H11 + Epyc 7xxx + RAM. I have a 6x4090 setup for local LLMs and I got myself a 128 GB M4 Max laptop thinking I'd do that, but if I'm being honest I need to get rid of that hardware. It's sitting idle because the SOTA ones are so much better for what I want.

sgerenser 12 hours ago | parent | prev [-]

$400 or $499?

deadbabe 16 hours ago | parent | prev | next [-]

Are we finally going to be forced to use something like CollapseOS, when the supply chains can no longer deliver chips to the masses?

kankerlijer 21 hours ago | parent | prev | next [-]

Well, thank the FSM that the article opens right up with "buy now!" No thanks, I'm kind of burnt out on mindless consumerism; I'll go pot some plants or something.

johnea 21 hours ago | parent [-]

I didn't see any of that.

I highly recommend disabling javascript in your browser.

Yes, it makes many sites "look funny", or maybe you have to scroll past a bunch of screen-sized "faceplant", "twitverse" and "instamonetize" icons, but there are far fewer ads (like none).

And of course some sites won't work at all. That's OK too, I just don't read them. If it's a news article, its almost always available on another site that doesn't require javascript.

piskov 20 hours ago | parent | next [-]

Probably using reader mode by default would be a less jarring experience (and you'll have an easy fallback).

intrasight 8 hours ago | parent [-]

Used to work. No longer. What does work is archive.today. But even that is at risk. Some sites now present encoded text when you view the archive.

zahlman 18 hours ago | parent | prev | next [-]

I would not be able to handle that due to video streaming, web clients for things like email, etc. And some sites I trust (including HN) provide useful functionality with JS (while degrading gracefully).

But I use NoScript and it is definitely a big help.

metadope 20 hours ago | parent | prev [-]

I whole-heartedly agree with your recommendation and join in encouraging more adopters of this philosophy and practice.

Life online without javascript is just better. I've noticed an increase in sites that are useful (readable) with javascript disabled. Better than 10 years ago, when broken sites were rampant. Though there are still the lazy ones that are just blank pages without their javascript crutch.

Maybe the hardware/resource austerity that seems to be upon us now will result in people and projects refactoring, losing some glitter and glam, getting lean. We can resolve to slim down, drop a few megs of bloat, use less ram and bandwidth. It's not a problem; it's an opportunity!

In any case, Happy New Year! [alpha preview release]

derelicta 10 hours ago | parent | prev | next [-]

I'd like to see a State-owned memory manufacturer

kotaKat 10 hours ago | parent | prev | next [-]

Does this mean developers will have to make their software work on hardware with 8GB of RAM and stop blindly assuming every person in the world gets a packed developer workstation like they do?

intrasight 8 hours ago | parent [-]

In an idealized market economy that would be the case. But it is far from ideal. Your individual incentive as a SWE is to write more code.

netbioserror 21 hours ago | parent | prev | next [-]

Positive downstream effect: The way software is built will need to be rethought and improved to squeeze efficiency out of stagnating hardware. Think of how staggering the step from the start of a console generation to the end used to be. Native-compiled languages have made bounding leaps that might be worth pursuing again.

yooogurt 20 hours ago | parent | next [-]

Alternatively, we'll see a drop in deployment diversity, with more and more functionality shifted to centralised providers that have economies of scale and the resources to optimise.

E.g. IDEs could continue to demand lots of CPU/RAM, and cloud providers are able to deliver that cheaper than a mostly idle desktop.

If that happens, more and more of its functionality will come to rely on having low datacenter latencies, making use on desktops less viable.

Who will realistically be optimising build times for use cases that don't have sub-ms access to build caches? And when those build caches are available, what will stop the median program from having an even larger dependency graph?

linguae 19 hours ago | parent [-]

I’d feel better about the RAM price spikes if they were caused by a natural disaster and not by Sam Altman buying up 40% of the raw wafer supply, other Big Tech companies buying up RAM, and the RAM oligopoly situation restricting supply.

This will only serve to increase the power of big players who can afford higher component prices (and who, thanks to their oligopoly status, can effectively set the market price for everyone else), while individuals and smaller institutions are forced to either spend more or work with less computing resources.

The optimistic take is that this will force software vendors into shipping more efficient software, but I also agree with this pessimistic take, that companies that can afford inflated prices will take advantage of the situation to pull ahead of competitors who can’t afford tech at inflated prices.

I don’t know what we can do as normal people other than making do with the hardware we have and boycotting Big Tech, though I don’t know how effective the latter is.

piskov 20 hours ago | parent | prev | next [-]

Some Soviet humor will help you understand the true course of events:

A dad comes home and tells his kid, “Hey, vodka’s more expensive now.” “So you’re gonna drink less?” “Nope. You’re gonna eat less.”

ip26 17 hours ago | parent | prev [-]

I have some hope for transpiling to become more commonplace. What would happen if you could write in Python, but trivially transpile to C++ and back?

disqard 10 hours ago | parent | prev | next [-]

The free market will surely fix this, amirite?

Maybe this is the free market working as intended -- did you know that RAM is actually a luxury item, like a Rolls Royce, and most plebes should just make do with 4gb machines, because that is the optimum solution!

bfrog 5 hours ago | parent [-]

Make all RAM the same density, lease out extra RAM through software. Rent all the things. Blackrock demands it.

shevy-java 20 hours ago | parent | prev | next [-]

I now consider this a mafia that aims to milk us for more money. This includes all AI companies but also manufacturers who happily benefit from this. It is a de-facto monopoly. Governments need to stop allowing this milking scheme to happen.

DamnInteresting 19 hours ago | parent | next [-]

When it's more than one company working together in a monopoly-like fashion, the term is "oligopoly".

https://www.merriam-webster.com/dictionary/oligopoly

vee-kay 19 hours ago | parent [-]

There is another word for it: cartel.

e.g., the Phoebus cartel https://en.wikipedia.org/wiki/Phoebus_cartel

throwaway94275 20 hours ago | parent | prev | next [-]

"Monopoly" means one seller, so you can't say multiple X makes a monopoly and make sense. You probably mean collusion.

If demand exceeds supply, either prices rise or supply falls, causing shortages. Directly controlling sellers (prices) or buyers (rationing) results in black markets unless enforcement has enough strength and integrity. The required strength and integrity seems to scale exponentially with the value of the good, so it's typically effectively impossible to prevent out-of-spec behavior for anything not cheap.

If everyone wants chips, semiconductor manufacturing supply should be increased. Governments should subsidize domestic semiconductor industries and the conditions for them to thrive (education, etc.) to meet both goals of domestic and economic security, and do it in a way that works.

The alternative is decreasing demand. Governments could hold bounty and incentive programs for building electronics that last a long time or are repairable or recyclable, but it's entirely possible the market will eventually do that.

rileymat2 16 hours ago | parent [-]

> If everyone wants chips, semiconductor manufacturing supply should be increased. Governments should subsidize domestic semiconductor industries and the conditions for them to thrive (education, etc.) to meet both goals of domestic and economic security, and do it in a way that works.

If there is already demand at this inflated price, shouldn’t we ask why more capacity is not coming online naturally first?

kaoD an hour ago | parent [-]

> If there is already demand at this inflated price, shouldn’t we ask why more capacity is not coming online naturally first?

...and why it has been consistently the case for a long while.

yupyupyups 20 hours ago | parent | prev | next [-]

Why would government officials and politicians want to stop making money?

yowlingcat 19 hours ago | parent | prev | next [-]

Technically it's a lot closer to a monopsony (Sam Altman/OAI cornering 40% of the DRAM market in a way that's clever for his interests but harms the rest of the world that wants to use it). I keep hoping that necessity will somehow spur China to become the mother of invention here and supply product into a now lopsided, constrained supply amid increasing demand, but I just don't know how practical that will be.

klooney 16 hours ago | parent | prev | next [-]

I mean, if you were the Micron CEO, would you bet the company on demand sustaining from AI? It seems like it could all go belly up very fast.

ekianjo 18 hours ago | parent | prev [-]

There is no monopoly in AI. I can name at least 10 big actors worldwide.

ggm 16 hours ago | parent [-]

If they collude on pricing and restrict new entrants, that's what the Sherman anti trust laws are about.

bdangubic 16 hours ago | parent [-]

the fines that would be levied via potential Sherman Act violations would be negligible, so that is for sure not a deterrent

ggm 12 hours ago | parent [-]

It would be understood that any action under the sherman act is unlikely and as you say, the financial penalties are tokenistic.

The non financial parts, which include mandated restructuring and penalties to directors including incarceration however, are not tokenistic. They'd be appealed and delayed, but at some point the shareholders would seek redress from the board. Ignoring judicial mandated instructions isn't really a good idea, current WH behaviour aside. If the defence here is "courts don't matter any more" that's very unhelpful, if true. At some point, a country which cannot enforce judicial outcomes has stopped being civil society.

My personal hope the EU tears holes in the FAANG aside, the collusive pricing of chips has been a problem for some time. The cost/price disjunction here is strong.

shmerl 20 hours ago | parent | prev | next [-]

> She said the next new factory expected to come online is being built by Micron in Idaho. The company says it will be operational in 2027

Isn't Micron stopping all consumer RAM production? So their factories won't help anyway.

terribleperson 20 hours ago | parent [-]

Micron is exiting direct to consumer sales. That doesn't mean their chips couldn't end up in sticks or devices sold to consumers, just that the no-middleman Crucial brand is dead.

Also, even if no Micron RAM ever ended up in consumer hands, it would still reduce prices for consumers by increasing the supply to other segments of the market.

walterbell 13 hours ago | parent [-]

> no-middleman Crucial brand is dead

It could be restarted in the future by Micron.

Crucial SSDs offer good firmware (e.g. nvme sanitize for secure erase) and hardware (e.g. power loss capacitors).

vittore 21 hours ago | parent | prev | next [-]

I've been ruminating on this for the past two years. In life before AI, most compute stayed cheap and was pretty much 90% idle; we are finally getting to the point of using all of that compute. We will probably find more algorithms to improve the efficiency of all the matrix computations, and with the AI bubble the same thing will happen that happened with the telecom bubble and all the fiber optic capacity that turned out to be drastically over-provisioned. Fascinating times!

shevy-java 20 hours ago | parent | next [-]

I don't think any of this is "fascinating" - it is more of a racket scheme. They push the prices up. Governments failed the people here.

yooogurt 20 hours ago | parent [-]

Isn't this more easily explained by supply-demand? Supply can't quickly scale, and so with increased demand there will be increased prices.

ozgrakkurt 15 hours ago | parent [-]

Imagine someone goes to the supermarket and buys all the tomatoes. Then supermarket owner says I don’t know, he bought all at once so it is a better sale. And he sells the remaining 10% of tomatoes at a huge markup

vittore 14 hours ago | parent [-]

I think it is better compared to Dutch folks buying all the tulip bulbs. And the price skyrocketed.

Ekaros 11 hours ago | parent [-]

Tulips were, by my understanding, more like NFTs: rich people gambling when bored, with promises of tulips in the future... futures contracts for tulips. And prices were high because the buyers were insanely rich merchants.

The RAM looks like cornering market. Probably something OpenAI should be prosecuted for if they end up profiting from it.

squibonpig 16 hours ago | parent | prev [-]

Except it's still sitting idle in warehouses while datacenters get built. They aren't running yet. Unlike with fiber, GPUs degrade rapidly with use, and for now datacenters need to be practically rebuilt to fit new generations, so we shouldn't expect much reusable hardware to come from this

Culonavirus 11 hours ago | parent | prev | next [-]

I mean yea, but this is THE wrong site to post stuff like this. Half the people here are the AI cock and the other half is riding it.

CTDOCodebases 10 hours ago | parent | prev | next [-]

"May"

29athrowaway 19 hours ago | parent | prev | next [-]

AI needs data, and data comes from consumer devices.

cglan 19 hours ago | parent | prev | next [-]

At this current pace, if "the electorate" doesn't see real benefits from any of this, 2028 is going to be a referendum on AI, unfortunately.

Whether you like it or not, AI right now is mostly

- high electricity prices
- crazy computer part prices
- phasing out of a lot of formerly high-paying jobs

and the benefits are mostly:

- slop and ChatGPT

Unless OpenAI and co produce the machine god, which genuinely is possible. If most people's interactions with AI are the negative externalities they'll quickly be wondering if ChatGPT is worth this cost.

caconym_ 15 hours ago | parent | next [-]

> they'll quickly be wondering if ChatGPT is worth this cost

They should be, and the answer is obviously no—at least to them. No political or business leader has outlined a concrete, plausible path to the sort of vague UBI utopia that's been promised for "regular folks" in the bullish scenario (AGI, ASI, etc.), nor have they convincingly argued that this isn't an insane bubble that's going to cripple our economy when AGI doesn't happen—a scenario that's looking more and more likely every day.

There is no upside and only downside; whether we're heading for sci-fi apocalypse or economic catastrophe, the malignant lunatics pushing this technology expect to be insulated from consequences whether they end up owning the future light-cone of humanity or simply enjoying the cushion of their vast wealth while the majority suffers the consequences of an economic crash a few rich men caused by betting it all, even what wasn't theirs to bet.

Everybody should be fighting this tooth and nail. Even if these technologies are useful (I believe they are), and even if they can be made into profitable products and sustainable businesses, what's happening now isn't related to any of that.

zaptheimpaler 19 hours ago | parent | prev | next [-]

I hope they do. We live in a time of incredibly centralized wealth & power and AI and particularly "the machine god" has the potential to make things 100x worse and return us to a feudal system if the ownership and profits all go to a few capital owners.

trinsic2 13 hours ago | parent [-]

IMHO this is exactly what is happening. Everyone should be on the phone with their senators, putting pressure on them to enforce antitrust and deal with Citizens United.

OGEnthusiast 15 hours ago | parent | prev | next [-]

> At this current pace, if "the electorate" doesn't see real benefits to any of this. 2028 is going to be referendum on AI unfortunately.

Not saying this is necessarily a bad prediction for 2028, but I'm old enough to remember when the 2020 election was going to be a referendum on billionaires and big tech monopolies.

stefan_ 18 hours ago | parent | prev [-]

For good measure, a bunch of this is funded through money taken directly from the electorates taxes and given to a few select companies, whose leaders then graciously donate to the latest Ballroom grift. Micron, so greedy they thought nothing of shutting down their consumer brand even when it costs them nothing at all, got $6B in Chips Act money in 2024.

arnaudsm 11 hours ago | parent | prev | next [-]

[deleted]

fartfeatures 11 hours ago | parent [-]

This take is pure Luddite nonsense. AI "lowering labor value while boosting capital" ignores centuries of automation: productivity gains cut costs, expand markets, create new jobs, and raise real wages despite short-term disruption.

Steam engines, electricity, computers displaced workers but spawned far more opportunities through new industries and cheaper goods. Same pattern now.

The "jobless masses stuck with 1GB phones eating slop" fantasy is backwards. Compute keeps getting vastly cheaper and more capable; AI speeds that up.

"Terrible for indie creators and startups"? The opposite: AI obliterates barriers to building, shipping, and competing. Solo founders are moving faster than ever.

It's the same tired doomer script we get with every tech wave. It ages poorly.

Inityx 8 hours ago | parent | next [-]

> Compute keeps getting vastly cheaper and more capable; AI speeds that up.

???

fartfeatures 5 hours ago | parent [-]

https://deepmind.google/blog/how-alphachip-transformed-compu...

Among others.

sidibe 8 hours ago | parent | prev [-]

None of the previous tech had the potential to do every economically productive thing we can do. It will spawn more opportunities, but maybe it will also fill those opportunities.

throwaway743 4 hours ago | parent | prev | next [-]

This might be a little conspiracy thinking, but I think it's possible that the recent drive from NVIDIA to move toward defense contracting and shift to B2B is part of a larger strategy. When you combine that with the rapid increases in hardware pricing and the push toward renting cloud compute, it looks like a form of collusion between the private sector and the government in the name of "national security".

The goal seems to be to squash the proliferation of open source LLMs and prevent individuals from running private, uncensored models at home. It is an effective way to kill any possible underdog or startup competition by making the "barrier to entry" (the compute) a rented privilege rather than a private resource. The partnership with Palantir seems to point directly to this, especially considering the ideologies of Thiel and Karp.

They are building a world where intelligence is centralized and monitored, and local sovereignty over AI is treated as a liability to be phased out

czhu12 15 hours ago | parent | prev | next [-]

It seems… fine? Hasn’t DRAM always been a boom and bust industry with no real inflation — in fact massive deflation — over the past 30 years?

Presumably the boom times are the main reason why investment goes into it so that years later, consumers can buy for cheap.

bilsbie 8 hours ago | parent | prev [-]

Take this to the limit and humans end up as a primitive, agrarian society and AI does its own thing with industry.