barrkel 4 hours ago

I don't buy the central thesis of the article. We won't be in a supply crunch forever.

However, I do believe that we're at an inflection point where DC hardware is diverging rapidly from consumer compute.

Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.

I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked, based on the logic that hardware had moved on but local compute was basically stagnant, and if I wanted to own my computing hardware, I'd better buy something now that will last a while.

This way, my laptop is just a disposable client for my real workstation, a Tailscale connection away, and I'm free to do whatever I like with it.

I could sell the RAM alone now for the price I paid for it.

raincole 3 hours ago | parent | next [-]

We won't be in a supply crunch forever. We'll have a demand crunch. The demand for powerful consumer hardware will shrink so much that producing it will lose the economies of scale. It was always bound to happen, just delayed by the trend of pursuing realistic graphics for games.

People who are willing to drop $20k on a computer might not be affected much tho.

TeMPOraL 3 hours ago | parent | next [-]

> People who are willing to drop $20k on a computer might not be affected much tho.

They probably won't, but those willing to drop $3-10k will be, if consumer and data-center computing diverge at the architectural level. It's the classic hollowing-out of the middle: most of the offerings end up in a race to the bottom chasing the volume of price-sensitive customers, the quality options lose economies of scale and disappear, and the high end becomes increasingly bespoke/pricey, or splits off into a distinct market with an entirely different type of customer (here: DC vs. individuals).

m3nu 3 hours ago | parent | prev | next [-]

My bet is that phone hardware will be used more and more in mini PCs and laptops, keeping costs down and volume up. We already see it with Apple and with many of the Chinese mini PC makers I looked at.

riskable 15 minutes ago | parent | next [-]

If this ends up being true, desktop Linux adoption might make inroads. Windows apps run like crap on ARM and no one is bothering to make ARM builds of their software.

Lord-Jobo 17 minutes ago | parent | prev [-]

Unified hardware helps some and hurts some. See: the same GPUs for gaming and for AI.

bparsons 2 hours ago | parent | prev | next [-]

The problem is that there is a very large incentive for three large companies to corner the market on computing components, forcing consumers to rent access instead of owning.

molszanski 3 hours ago | parent | prev | next [-]

> We won't be in a supply crunch forever.

This is what always happens in capitalism. Scarcity is almost always followed by glut.

drecked 3 hours ago | parent [-]

I don’t believe we are seeing the investments necessary that would indicate this will happen.

Memory makers, for example, have sold out their inventory for several years, but instead of investing to manufacture more, they’re shutting down their consumer divisions. They’re just transferring their consumer supply to their B2B (read AI) supply instead.

That's likely because they don't expect this demand to last past a few years.

fmajid 2 hours ago | parent | next [-]

They have seen boom and bust cycles previously and are understandably wary of expanding capacity for expected demand that may fizzle. If they stay too conservative, China’s CXMT is chomping at the bit to eat their lunch, backed by the Chinese government, but that’s not going to help until late 2027 at best.

bitmasher9 3 hours ago | parent | prev [-]

If the demand lasts for a few years, I’m doubtful that all of the consumer capacity will come back.

close04 2 hours ago | parent | prev [-]

> We'll have a demand crunch

This is what I'm afraid of. As more stuff moves to the cloud, helped in part by the current prices of HW, the demand for consumer hardware will drop. This will keep turning the vicious cycle of rising consumer HW prices and more moves to the cloud.

I can already see Nvidia rubbing their hands together in expectation of the massive influx of customers to their cloud gaming platform. If a GPU is so expensive, you move to a rental model and the subsequent drop in demand will make GPUs even more expensive. They're far from the only ones with dollar signs in their eyes, between the money and total control over customers this future could bring.

Being entirely reliant on someone else's software and hardware is a bleak thought for a person used to some degree of independence and self sufficiency in the tech world.

HKH2 2 hours ago | parent [-]

> I can already see Nvidia rubbing their hands together in expectation of the massive influx of customers to their cloud gaming platform.

Roblox is not popular because of its graphics. Younger gamers care more about having fun than having an immersive experience.

close04 7 minutes ago | parent | next [-]

I think we're talking about 2 different things. I'm not sure where Roblox fits into what I said.

Nvidia was just an example of this because they stand out. GPUs are so expensive now that many gamers were eyeing GeForce Now as a viable long term solution for gaming. Just recently there was a discussion on HN about GeForce Now where a lot of comments were "I can pay for 10 years of GeForce Now with the price of a 5090, and that's before counting electricity" (that napkin math is sketched below). All upsides, right?

In parallel, Nvidia probably sees more money in the datacenter market, so it would rather focus the available production capacity there. Once enough gamers move away from local compute, the demand is unlikely to come back, so future generations of GPUs would get more and more expensive to cater to an ever-shrinking market. This is the vicious cycle: expensive GPUs + cheap cloud gaming -> shrinking GPU market and higher GPU prices -> more of step 1.

Roblox is one example of a game; there are many popular games that aren't graphics-intensive or don't rely on eye candy. But what about all the other games that require a beefy GPU to run? Gamers will want to play them, and Nvidia, like most other companies, sees more value in recurring revenue than in one-time sales. A GPU you own won't bring Nvidia money later; a subscription keeps doing that.
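To make that napkin math concrete, here's a tiny Python sketch of the break-even calculation. Every figure in it is an assumption of mine for illustration (GPU price, power draw, subscription cost), not actual Nvidia or GeForce Now pricing:

    # Break-even sketch for "10 years of GeForce Now vs. buying a 5090".
    # All figures are assumptions for illustration, not quoted prices.

    gpu_price = 2000.0      # assumed 5090 street price, USD
    gpu_power_w = 575       # assumed board power under load, W
    kwh_price = 0.15        # assumed electricity price, USD/kWh
    hours_per_month = 40    # assumed gaming hours per month
    sub_per_month = 20.0    # assumed cloud gaming subscription, USD/month

    electricity = gpu_power_w / 1000 * hours_per_month * kwh_price
    break_even_months = gpu_price / (sub_per_month - electricity)

    print(f"Local electricity: ${electricity:.2f}/month")
    print(f"Break-even: {break_even_months:.0f} months"
          f" (~{break_even_months / 12:.1f} years)")

With these particular assumptions the subscription takes roughly a decade to cost as much as the card plus its electricity, which is about where those HN comments landed; change the hours or prices and the answer moves a lot.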

hombre_fatal 2 hours ago | parent | prev | next [-]

Yeah, this gamer conspiracy theory never made sense to me.

Also, if gamers demand infinitely improving graphics so much that they would rather pay for cloud gaming than relax their expectations and be happy with, say, current-gen graphics, then that is more a claim about modern self-pwned gamer behavior than about a megacorp conspiracy.

But I don't buy that either. The biggest games on Steam Charts and Twitch aren't AAA RTX 5090 games.

foobarian 2 hours ago | parent | prev [-]

I love it when I get my Robloxhead daughter to test drive some of the games I play on my 5090 box. "Ooooh these graphics are unreal" "Can we stop for just a moment and admire this grass" :-D

kace91 4 hours ago | parent | prev | next [-]

The thing is, other than AI stuff, where does a non powerful computer limit you?

My phone has 16gigs of ram and a terabyte of storage, laptops today are ridiculous compared to anything I studied with.

I'm not arguing mind you, just trying to understand the usecases people are thinking of here.

zozbot234 4 hours ago | parent | next [-]

> other than AI stuff, where does a non powerful computer limit you?

Running Electron apps and browsing React-based websites, of course.

tormeh 3 hours ago | parent | next [-]

For real. Once I've opened Spotify, Slack, Teams, and a browser about 10GB of RAM is in use. I barely have any RAM left over for actual work.

foobarian 2 hours ago | parent | next [-]

I keep wondering why we can't have 2000s software on today's hardware. Maybe because browsers are de facto required to build apps?

lo_zamoyski an hour ago | parent | prev | next [-]

Seems like the perfect target for ESG.

skydhash 3 hours ago | parent | prev [-]

That's why I only run those on work computers (where they are mandated by the company). My personal computers are free of this software.

lpcvoid 3 hours ago | parent [-]

I rarely dodge a chance to shit on Microslop and its horrible products, but you don't use a browser? In fact, running all that junk in a single Chromium instance is quite a memory saver compared to individual Electron applications.

FeepingCreature an hour ago | parent | next [-]

Fun fact: you can kill all Firefox background processes, basically hand-crashing every tab, and just reload the pages in the morning. I do this every evening before bed. `pkill -f contentproc` and my CPU goes from wheezing to idle, as well as releasing ~8 GB of memory on busy days.

("Why don't you just close firefox?" No thanks, I've lost tab state too many times on restart to ever trust its sessionstore. In-memory is much safer.)

Fnoord 16 minutes ago | parent [-]

Yeah, I found this out the other day when my laptop was toasting. In hindsight, probably related to archive.today or some Firefox extension.

You have to close Firefox every now and then for updates, though. The issue you describe seems better dealt with at the filesystem level, with a CoW filesystem such as ZFS. That way, versioning and snapshots are a breeze, and your whole homedir could benefit.

bluGill 2 hours ago | parent | prev | next [-]

I use a browser at home, but I don't use the heaviest websites. There are several options for my hourly weather update, some worse than others (sadly I haven't found any that are lightweight). I just need to know if there will be a thunderstorm when I ride my bike home from work, which would mean I shouldn't ride in now.

Fnoord 12 minutes ago | parent | next [-]

Yr.no [1] is free, and available in English. Thanks to Norway. Apps available as well.

[1] https://en.wikipedia.org/wiki/Yr.no

lpcvoid 2 hours ago | parent | prev | next [-]

Try Quickweather (with OpenMeteo) if you're on Android. I love it.

https://f-droid.org/en/packages/com.ominous.quickweather/

TeMPOraL 2 hours ago | parent | prev [-]

I'm giving up on weather-app bullshit at this point, and am currently (literally at this moment) making myself a Tasker script to feed hourly weather predictions into a calendar, so I can see them displayed inline with events on my calendar and, most importantly, my watch[0] - i.e. in the context where it actually matters.

--

[0] - Having https://sectograph.com/ as a watch face is 80%+ of value of having a modern smartwatch to me. Otherwise, I wouldn't bother. I really miss Pebble.
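For anyone who wants the same thing without Tasker: below is a rough Python sketch of the idea, pulling hourly rain probability from the free (no API key) Open-Meteo forecast API and writing minimal iCalendar events. The coordinates and the 50% threshold are placeholders, and the .ics output is deliberately minimal rather than strictly RFC 5545 complete:

    # Sketch: hourly rain forecast -> calendar events in a .ics file.
    # Coordinates and the 50% threshold are placeholders.
    import json
    import urllib.request
    from datetime import datetime, timedelta

    LAT, LON = 52.52, 13.41  # placeholder coordinates (Berlin)
    url = (f"https://api.open-meteo.com/v1/forecast?latitude={LAT}"
           f"&longitude={LON}&hourly=precipitation_probability&forecast_days=1")
    hourly = json.load(urllib.request.urlopen(url))["hourly"]

    events = []
    for ts, prob in zip(hourly["time"], hourly["precipitation_probability"]):
        if prob and prob >= 50:  # arbitrary "worth showing on the watch" cutoff
            start = datetime.fromisoformat(ts)
            end = start + timedelta(hours=1)
            events.append("BEGIN:VEVENT\n"
                          f"DTSTART:{start:%Y%m%dT%H%M%S}\n"
                          f"DTEND:{end:%Y%m%dT%H%M%S}\n"
                          f"SUMMARY:Rain {prob}%\n"
                          "END:VEVENT")

    with open("weather.ics", "w") as f:
        f.write("BEGIN:VCALENDAR\nVERSION:2.0\n"
                + "\n".join(events) + "\nEND:VCALENDAR\n")

Import the resulting file into whatever calendar the watch face reads and the forecast shows up inline with events.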

skydhash 3 hours ago | parent | prev [-]

Why would I need a browser to play music? Or to send an email? Or to type code? My browser usage is mostly for accessing stuff on someone else’s computer.

lpcvoid 2 hours ago | parent | next [-]

The only subscription I have is Spotify, since there's no easy way that I know of to get the music discoverability that Spotify offers.

For the rest: I agree with you.

sys_64738 an hour ago | parent | prev [-]

Plex or Jellyfin client access.

teeray 3 hours ago | parent | prev | next [-]

Companies love externalizing the costs of making efficient software onto consumers, who need to purchase more powerful computing hardware.

vladvasiliu 2 hours ago | parent [-]

If only. At work I've got a new computer, replacing a lower-end 5-year-old model. The new one has four times the cores, twice the RAM, a non-circus-grade SSD, and a high-powered CPU as opposed to the "U"-series chip the old one had.

I haven't noticed any kind of difference when using Teams. That piece of crap is just as slow and broken as it always was.

sfn42 2 hours ago | parent [-]

Yeah, people love to shit on Electron and such, but they're full of crap. It doesn't matter one bit for anything more powerful than a Raspberry Pi. Probably not even there. "Oh boo hoo, Chrome uses 2 gigs of RAM." So what? You have 16+; it doesn't matter. I swear people have some weird idea that the ideal world is one where 98% of their RAM just sits unused. The whole point of RAM is to use it, but whenever an application does use it, people whine. And it's not even "this makes my PC slow", it's literally just "hurr durr RAM usage is X". Okay, but is there an actual problem? Crickets.

dgb23 an hour ago | parent | next [-]

I have no issue with browsers specifically having to use a bunch of resources. They are complicated-as-fuck software, basically an operating system of their own. Same for video games or programs that do heavy data processing.

The issue is with applications that have no business feeling entitled to large amounts of resources. A chat app is a program that runs in the background most of the time and is used for sporadic communication. Same for music players, etc. We've had these sorts of things since the '90s, when high-end consumer PCs had 16 MB of RAM.

gryfft 2 hours ago | parent | prev | next [-]

"chrome uses 2gb of ram"

These days individual _tabs_ are using multiple GB of RAM.

duskdozer an hour ago | parent | prev | next [-]

The web browser on my phone instantly gets killed the moment I switch to another app because it eats up so much ram.

interf4ce 2 hours ago | parent | prev | next [-]

The issue isn't usage, it's waste. Every byte of RAM that's used unnecessarily because of bloated software frameworks used by lazy devs (devs who make the same arguments you're making) is a byte that can't be used by the software that actually needs it, like video editing, data processing, 3D work, CAD, etc. It's incredibly short sighted to think that any consumer application runs in a vacuum with all system resources available to it. This mindset of "but consumers have so much RAM these days" just leads to worse and worse software design instead of programmers actually learning how to do things well. That's not a good direction and it saddens me that making software that minimizes its system footprint has become a niche instead of the mainstream.

tl;dr, no one is looking for their RAM to stay idle. They're looking for their RAM to be available.

sfn42 an hour ago | parent [-]

I dunno man, I have 32gb and I'm totally fine playing games with 50 browser tabs open along with discord and Spotify and a bunch of other crap.

I'm not trying to excuse crappy developers making crappy, slow and wasteful apps; I just don't think Electron itself is the problem. Nor do I think it's a particularly big deal if an app uses some memory.

vladvasiliu 2 hours ago | parent | prev [-]

I think it's a correlation vs causation type thing. Many Electron apps are extremely, painfully, slow. Teams is pretty much the poster child for this, but even spotify sometimes finds a way to lag, when it's just a freaking list of text.

Are they slow because they're Electron? No idea. But you can't deny that most Electron apps are sluggish for no clear reason. At least if they were pegging a CPU, you'd figure your box is slow. But that's not even what happens. Maybe they would have been sluggish even using native frameworks. Teams seems to do 1M network round-trips on each action, so even if it were perfectly optimized assembly for my specific CPU, it would probably make no difference.

sfn42 an hour ago | parent [-]

Nearly all apps are sluggish for a very clear reason: the average dev is ass. It's possible to make fast apps using Electron, just like it's possible to make fast apps using anything else. People complain about React too; React is fast as fuck. I can make React apps snappy as hell. It's just crappy devs.

Esophagus4 3 hours ago | parent | prev | next [-]

It seems like as hardware gets cheaper, software gets more bloated to compensate. Or maybe it’s vice versa.

I wonder if there’s a computer science law about this. This could be my chance!

mjmas 3 hours ago | parent | next [-]

Is your name Wirth?

Esophagus4 18 minutes ago | parent [-]

Dangit! Always the bridesmaid, never the bride

2 hours ago | parent | prev | next [-]
[deleted]
daveguy 3 hours ago | parent | prev [-]

Sorry to burst your bubble:

https://en.wikipedia.org/wiki/Wirth%27s_law

Not exactly the same (it's about power rather than price), but close enough that when you said it, I thought, "oh! there is something like that." There are also more fundamental economic laws at play around supply and demand of a resource, efficiencies at scale, etc. Given our ever-increasing demand for compute compared to the increasing supply (cheaper, more powerful compute), I expect the supply will bottleneck before the demand does.

Esophagus4 12 minutes ago | parent [-]

Ah, so you think there's a point where bloat actually slows down because we eventually can't keep up with demand for compute?

I guess this might be happening with LLMs already

mettamage 3 hours ago | parent | prev [-]

This is the way

swiftcoder 3 hours ago | parent | prev | next [-]

The big one for me is ballooning dependency trees in popular npm/cargo frameworks. I had to trade a perfectly good i9-based MacBook Pro up to an M2, just to get compile times under control at work.

The constant increases in website and electron app weight don't feel great either.

WillAdams 2 hours ago | parent | prev | next [-]

3D CAD/CAM is still CPU- (and to a lesser extent memory-) bound. I do joinery, and my last attempt at a test joint for a project I'm still working up to was a 1" x 2" x 1" area (two 1" x 1" x 1" halves which mated). It took an entry-level CAM program some 18-20 minutes to calculate and produced a ~140 MB file including G-code toolpaths... (I really should have tracked memory usage.)

mbfg 2 hours ago | parent | prev | next [-]

I've never had a personal computer that came even close to being powerful enough to do what I want. Compiles that take 15 minutes are really annoying, for instance.

duskdozer 2 hours ago | parent | prev | next [-]

>My phone has 16gigs of ram and a terabyte of storage

That's "non powerful" to you?

kace91 2 hours ago | parent [-]

The opposite. I meant that if this is what consumer grade looks like nowadays, even with a fraction of current flagships we seem well covered - this was less than 800 bucks.

wat10000 an hour ago | parent | prev | next [-]

I’d love it if a clean build and test on the biggest project I work in would finish instantly instead of taking an hour.

diablevv an hour ago | parent | prev [-]

[dead]

TheRoque an hour ago | parent | prev | next [-]

We are on borrowed time. Most of the world runs on oil, and that resource is not unlimited at all. A lot of countries have gone past their production peak, meaning it's only downhill from here. Everything is gonna be more costly, more expensive. Our lavish "democratic" lifestyles are only possible because we have (had) this amazing, freely available resource; without it, things are gonna change. Even at a geopolitical scale you can see this pretty obviously: countries that talked about free markets and free exchange are now starting to close their doors and play individually. Anyway, my point is, we are in for decades, if not a century, of slow decline.

margalabargala an hour ago | parent [-]

Doubt it. Renewables are expanding much faster than oil output is decreasing. Wind and solar will enable energy to remain cheap everywhere that builds it.

Lord-Jobo 13 minutes ago | parent | next [-]

There will be very dramatic growing pains with this switch, especially for A: nations manufacturing renewables but still running that manufacturing on oil, and B: nations that face political and economic barriers to renewables.

Also C: nations that are both A and B, needlessly causing oil volatility with unplanned military dickheadedness.

lo_zamoyski an hour ago | parent | prev | next [-]

Malthusians have been sounding the alarm for longer than Protestant revivalists have been claiming the end of the world is next month at lunchtime. If there is a prediction market for such things, betting on any Malthusian is patently foolish.

(Of course, I don't disagree with the notion that consumerism produces an extraordinary amount of worthless trash, but that's a different matter. The main problem with consumerism is consumerism itself as a spiritual disease; the material devastation is a consequence of that.)

antisthenes 19 minutes ago | parent [-]

People gloating about Malthusians being wrong keep forgetting that they only need to be right ONCE in the entirety of human history, and when they are, you'll be too busy trying to survive to post on internet forums.

The planet has a certain resource-bound carrying capacity. It's a fact of physics. Just because we aren't there yet as of (checks time) 2026-03-27 doesn't mean the Malthusians are wrong.

Although to be fair to the other side, I think with abundant renewable energy we'll be able to delay resource depletion for a very long time thanks to recycling (and lower standards of living of course).

TheRoque an hour ago | parent | prev | next [-]

Renewables provide electricity only, but planes, boats, trucks, basically the whole supply chain, run on oil alone for the moment. The ease of use of oil has not been replaced yet. Do you realize how easy it is to handle oil? You can just put it in a barrel and ship it anywhere in that barrel. No need for wires or complex batteries as with electricity, nor complex pipelines as with gas.

And even if we figured out how to electrify everything (which we didn't as I just said), we would still run into resources shortages for batteries, wires (copper etc.), nuclear fuel (uranium)...

margalabargala 6 minutes ago | parent | next [-]

Expanding renewables to the easily replaceable items like power plants, generators, and most consumer vehicles would radically reduce oil usage to where it becomes a minor concern. Also things like biodiesel exist. A more sustainable, renewable-forward, electrified reality is easily possible.

There is not a risk of resource shortage of copper. The doomer and prepper talking points you're parroting are not based in reality.

kogasa240p 36 minutes ago | parent | prev [-]

Keep your eye on butanol https://phys.org/news/2026-02-microbial-eco-friendly-butanol....

globalnode 43 minutes ago | parent | prev [-]

Hilarious how comments like this consistently get downvoted; there are a lot of special-interest lurkers on this forum.

guessmyname 4 hours ago | parent | prev | next [-]

> I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked […]

768GB of RAM is insane…

Meanwhile, I’ve been going back and forth for over a year about spending $10k on a MacBook Pro with 128GB. I can’t shake the feeling I’d never actually use that much, and that, long term, cloud compute is going to matter more than sinking money into a single, non-upgradable machine anyway.

rafaelmn 3 hours ago | parent | next [-]

Your battery is going to suffer because of the extra ram as well.

I don't know your workloads, but for me personally 64 GB is the ceiling on RAM - I can run an entire k8s cluster locally with that, and the M5 Pro with top cores is the same CPU as the M5 Max. I don't need the GPU - the local AI story and OSS models are just a toy for my use cases, and I'm always going to shell out for the API/frontier capabilities. I'm even thinking of the 48 GB config, because those already go for 8% discounts shipped by Amazon, and I never hit even that much on my workstation with 64 GB.

zozbot234 3 hours ago | parent [-]

> Your battery is going to suffer because of the extra ram as well.

No, it won't. The power drain of merely refreshing DRAM is negligible; it's no higher than the drain you'd see in S3 standby over the same time period.
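Napkin math backs that up, though the self-refresh figure below is my assumption and varies a lot by DRAM type and vendor:

    # Scale check: keeping 64 GB of extra DRAM alive in self-refresh.
    # The mW/GB figure is an assumption for illustration; real parts vary.

    extra_ram_gb = 64             # e.g. a 128 GB config vs. a 64 GB one
    self_refresh_mw_per_gb = 1.5  # assumed self-refresh power, mW/GB
    battery_wh = 72               # assumed laptop battery capacity, Wh

    drain_w = extra_ram_gb * self_refresh_mw_per_gb / 1000
    pct_per_day = drain_w * 24 / battery_wh * 100
    print(f"~{drain_w:.2f} W extra, ~{pct_per_day:.1f}% of battery per idle day")

On those assumptions it comes out to a few percent of the battery per fully idle day, which is noise next to the display and SoC.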

cduzz 2 hours ago | parent | next [-]

I suspect this is one of those "it depends" situations: does the 128 GB vs. 64 GB SKU have more chips or denser chips? If "more chips", it'll probably draw a tiny bit more power than the smaller version. If "denser chips", it may draw more power, but by such a tiny difference that it's immaterial.

Similarly, having more cache may mean less SSD activity, which may mean less energy draw overall.

If I had a chip to put on the roulette table of this "what if" I'd put it on the "it won't make a difference in the real world in any meaningful way" square.

3form 3 hours ago | parent | prev [-]

Given that DRAM refresh is part of S3 standby, I'm afraid this is circular reasoning.

barrkel 4 hours ago | parent | prev | next [-]

Look at the way age gating is going, in a globally coordinated push. Can control of compute be far behind?

It wasn't my primary motivator but it hasn't made me regret my decision.

I hummed and hawed on it for a good few months myself.

WillAdams 2 hours ago | parent | next [-]

Just look at ITAR and the various attempts at legislating 3D printing and CNC machining of firearm parts to see one justification for that.

jusssi 3 hours ago | parent | prev [-]

> Can control of compute be far behind?

How is this going to work? You need uncontrolled compute for developing software. Any country locking up that ability too much will lose to those who don't.

cesarb 2 hours ago | parent [-]

> How is this going to work? You need uncontrolled compute for developing software.

I've read about companies where all software developers have to RDP to the company's servers to develop software, either to save on costs (sharing a few powerful servers with plenty of RAM and CPU between several developers) or to protect against leaks (since the code and assets never leave the company's Citrix servers).

gzread 4 hours ago | parent | prev | next [-]

> 768GB of RAM is insane.

Before this price spike, you could get a second-hand rack server with 1TB of DDR4 for about $1000-2000. People were massively underestimating the performance of reasonably priced server hardware.

You can still get that, of course, but it costs a lot more. The recycling company I know is now taking the RAM out of every server and selling it separately.

Apple hardware is incredibly overpriced.

FpUser 3 hours ago | parent | prev [-]

My home server has 512GB of RAM and 48 cores; my 4 desktops are 16 cores, 128GB, and a 4060 GPU each. The server is second-hand and I paid around $2500 for it. The desktops were just below $3000 each when I built them. All prices are in Canadian Pesos.

xyzsparetimexyz 3 hours ago | parent [-]

Canadian Pesos?

doubled112 3 hours ago | parent | next [-]

Jokes, because the Canadian dollar's value isn't very high right now.

See a $1100 GPU on eBay, but it's in the US? That's actually a $1900 GPU.

A colleague and I were just talking about how well he timed the purchase of his $700 24GB 3090.

heffer 3 hours ago | parent | prev | next [-]

Please, it's actually Cambodian Dollhairs or Canuckistan Pesos.

FpUser 3 hours ago | parent | prev [-]

It is sarcasm. Our dollar, which used to be on par with the US dollar, is no more.

NoSalt an hour ago | parent | prev | next [-]

> "I personally dropped $20k on a high end desktop"

This absolutely boggles my mind. Do you mind if I ask what type of computing you do in order to justify this purchase to yourself?

distances 4 minutes ago | parent [-]

I'm thinking the same. My total computing purchases in the last 25 years, including desktops, laptops, monitors, phones, and tablets, are way under $20k.

I would bet it will keep being more affordable to buy reasonable specs in current consumer hardware as needed, rather than buying a top-end system once.

motbus3 2 hours ago | parent | prev | next [-]

I believe that, superficially speaking, you could be right. But I think it has been realised that causing scarcity of products and commodities is a power move.

We live in a world optimised for globalization: industry in China, oil in the Middle East, etc.

This approach has proved fragile in the hands of people with enough money and/or power to tilt the scale.

windexh8er an hour ago | parent [-]

It's not a power move, it's a cartel, and they've done this before. Gamers Nexus did a fantastic piece on how where we're at today is very similar to the DRAM price fixing and market manipulation of just a couple of decades ago [0]. This is the big players taking full advantage of an opportunity for profit.

[0] https://youtu.be/jVzeHTlWIDY?si=cRJ6C7jPxLIpKTyF

Aurornis 2 hours ago | parent | prev | next [-]

> Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node.

How can you say this when Apple is releasing extremely fast M5 MacBook Pros? Or the $600 MacBook Neo that has incredible performance for that price point?

Even x86 is getting some interesting options. The Strix Halo platform has become so popular with LLM users that the parts are being sold in high numbers for little desktop systems.

embedding-shape an hour ago | parent | next [-]

They're ultimately laptops; you won't be able to squeeze the same amount of performance out of a laptop as out of a desktop, regardless of the hardware.

If you haven't tried a desktop CPU in a while, I highly recommend giving one a try if you're used to only using laptops. Even within the same class, the difference is obvious.

chmod775 an hour ago | parent | prev [-]

They're fast, but they'll never come even remotely close to what a mid-range desktop PC with dedicated graphics burning 500W can do.

A 300W GPU released in 2025 has about 10x the perf of an M5. The difference will be smaller for CPU perf, but it's still not close.

picture 4 hours ago | parent | prev | next [-]

It seems like you largely agree with the article: people shall own nothing and be happy. Perhaps the artificially induced supply crunch could go on indefinitely.

Also, I wonder how many of us, even here on HN, have the ability to spend that amount of money on a computer for personal use. Frankly, I wouldn't even know what to do with all that RAM - should I just ramdisk every program I use and every digital thing I've made in the last five years?

Anyhow, I suppose the folks who can't afford hardware (perhaps by design) will have to own nothing and be happy.

bluedino an hour ago | parent | next [-]

I've always gone way over on RAM, for the most part: 32, 64, then 128GB of memory.

I never really used it all, usually only about 40%, but it's one of those better-to-have-than-need things, and better than selling and re-buying a machine with more memory (when it's something you can't upgrade, like a Mac or certain other laptops).

barrkel 4 hours ago | parent | prev | next [-]

People spend a lot more than that on a car they use less, especially if they're in tech.

The RAM choice was because I have never regretted buying more RAM - it's practically always a better trade than a slightly faster CPU - and 96GB DIMMs were at a sweet spot compared to 128GB DIMMs.

That, and the ability to have big LLMs in memory, for some local inference, even if it's slow mixed CPU/GPU inference, or paged on demand. And if not for big LLMs, then to keep models cached for quick swapping.
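For a sense of what "big LLMs in memory" means at 768 GB, here's rough footprint arithmetic. The model sizes, quantization levels, and the ~10% overhead for KV cache and runtime buffers are all illustrative assumptions:

    # Rough check: which model sizes fit in 768 GB of system RAM?
    # Sizes, quant levels, and the 10% overhead are illustrative assumptions.

    def footprint_gb(params_b, bits_per_weight, overhead=1.1):
        # params in billions * bytes per weight, plus ~10% slack
        return params_b * bits_per_weight / 8 * overhead

    for name, params_b, bits in [("70B dense @ 8-bit", 70, 8),
                                 ("~400B MoE @ 8-bit", 400, 8),
                                 ("~1T MoE @ 4-bit", 1000, 4)]:
        gb = footprint_gb(params_b, bits)
        print(f"{name}: ~{gb:.0f} GB ({'fits' if gb < 768 else 'too big'})")

By this arithmetic even trillion-parameter-class MoE models fit at 4-bit, which is the "slow mixed CPU/GPU inference" case mentioned above.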

sfn42 2 hours ago | parent [-]

I bought a 4 year old car for significantly less than that. And I can get a computer that can do 99% of what your monster can do for like 10% of the price. And if I want LLM inference I can get that for like $20 a month or whatever.

I don't mean to judge, it's your money but to me it seems like an enormous waste. Just like spending $100k on a car when you can get one for $15k that does pretty much exactly the same job.

barrkel an hour ago | parent [-]

Sure. You're right, it is my money. And I pay even more for inference on top; I have OpenRouter credits, OpenAI subscription, Claude Max subscription.

It's not so easy to get nice second-hand hardware here in Switzerland, and my HEDT is nice and quiet, doesn't need to be rack-mounted, plugs straight into the wall. I keep it in the basement next to the internet router anyway.

The "sensible" choice is to rent. It's the same with cars; most people these days lease (about 50% of new cars in CH, which will be a majority if you compare it with auto loan and cash purchase).

sfn42 an hour ago | parent [-]

I don't think leasing cars is sensible. Last time I checked (for cheaper cars, mind you), I would essentially pay 60% of the sticker price over a few years and then not have a car at the end of it. It would be better to buy a new car and then sell it after the same period. But what's even better is to not buy new at all: let some other sucker take the huge value loss, then snatch the car up at a 30-60% discount a few years later. Then you can sell it a few years after that for not much less than you paid for it. I've had mine a year, and right now they're going for more than I paid.

I think leasing might be okay-ish if you find a really good deal, but it's really not much different from buying new, which is just a shit deal no matter how you turn it. A 1-4 year old car is pretty much new anyway; I don't see any reason to buy brand new.

9wzYQbTYsAIc 3 hours ago | parent | prev [-]

Solarpunk + https://permacomputing.net/

That’s for everyone

kgeist 3 hours ago | parent | prev | next [-]

>we're at an inflection point where DC hardware is diverging rapidly from consumer compute.

I thought the trend was in the opposite direction, with the RTX 5x series converging with server architectures (Blackwell-based, such as the RTX 6000 Pro+). Just less VRAM and fewer tensor cores, artificially.

Where is the divergence happening? Or do you not view the RTX 5x as consumer hardware?

girvo 2 hours ago | parent [-]

Blackwell diverges within Blackwell itself… SM121 on the GB10, the RTX 5000-series consumer parts, and the actual full-fat B100 hardware all have surprisingly different abilities. The GB10 has been hamstrung by this a bit, too.

porkeynon 2 hours ago | parent | prev | next [-]

$20k?

People laugh at young men for looksmaxxing. And then there’s this. I dunno. As someone who has been playing computer games since the 70s, I clearly do not understand the culture anymore. But what forces would drive a young man to spend the price of a used car to play a derivative FPS? It seems heartbreaking. Just like the looksmaxxer.

barrkel 2 hours ago | parent | next [-]

Alas, I'm not a young man any more. And my HEDT is headless, it has no monitor with which to play FPSes.

embedding-shape an hour ago | parent | prev [-]

[dead]

dmitrygr 41 minutes ago | parent | prev | next [-]

> Laptops are increasingly just clients for someone else's compute

Are you kidding? Apple's mobile chips now deliver perf that AMD and Intel desktop chips never did.

danaris an hour ago | parent | prev | next [-]

> Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.

What are you talking about?

My laptops are, and always have been, primarily places where I do local computing. I write code there, I watch movies there, I listen to music there, I play games there...all with local storage, local compute, and local control (though I do also store a bunch of my movies on a personal media server, housed in my TV stand, because it can hold a lot more). My smartphone is similar.

If you think that the vast majority of the work most people do on their personal computers is moving to LLMs, or cloud gaming, then I think you are operating in a pretty serious bubble. 99.9% of all work that most people do is still best done locally: word processing, spreadsheets, email, writing code, etc. Even in the cases where the application is hosted online (like Google Docs/Sheets), the compute is still primarily local.

The closest to what you're describing that I think makes any sense is the proliferation of streaming media—but again, while they store the vast libraries of content for us, the decoding is done locally, after the content has reached our devices.

It doesn't matter if a cutting-edge AI-optimized server can perform 10, 100, or 1000 times better than my laptop at any particular task: if the speed at which my laptop performs it is faster than I, as a human, can keep up (whatever that means for the particular task), then there's no reason not to do the task locally.

shevy-java 3 hours ago | parent | prev | next [-]

> We won't be in a supply crunch forever.

I don't share a 1:1 opinion with the article, but it is absolutely clear that RAM prices have gone up enormously. Just compare them; that is fact.

It may be cheaper later on, but when will that happen? Is there a guarantee? A supply crunch can also mean that fewer people can afford something, because prices are now much higher than before. Add to this the oil crisis Trump started, and we are suddenly paying more just so a few mafiosi can benefit. (See Krugman's analysis of the recent stock market flow of money/stocks.)

SirMaster 40 minutes ago | parent | next [-]

The increase looks dramatic because we were coming off an all-time price low. RAM has been this expensive at least twice before, and it always dropped way back down afterward.

bluGill 2 hours ago | parent | prev [-]

General predictions are that in 3-5 years things will return to normal: 3 years if the current AI crunch is a short-term thing, 5 years if it isn't and we have to build new RAM factories.

echelon 3 hours ago | parent | prev [-]

Local is a dead end.

Open source efforts need to give up on local AI and embrace cloud compute.

We need to stop building toy models to run on RTX and instead try to compete with the hyperscalers. We need open weights models that are big and run on H200s. Those are the class of models that will be able to compete.

When the hyperscalers reach takeoff, we're done for. If we can stay within ~6 months, we might be able to slow them down or even break them.

If there were something 80-90% as good as Opus or Seedance or Nano Banana, more of the ecosystem would switch to open source, because it offers control and sovereignty. But we don't have that right now.

If we had really competitive open weights models, universities, research teams, other labs, and other companies would be able to collaboratively contribute to the effort.

Everyone in the open source world is trying to shrink these models to fit on their 3090 instead, though, and that's such a wasted effort. It's short term thinking.

An "OpenRunPod/OpenOpenRouter" + one click deploy of models just as good as Gemini will win over LMStudio and ComfyUI trying to hack a solution on your own Nvidia gaming card.

That's such a tiny segment of the market, and the tools are all horrible to use anyway. It's like we learned nothing from "The Year of Linux on Desktop 1999". Only when we realized the data center was our friend did we frame our open source effort appropriately.

zozbot234 3 hours ago | parent | next [-]

> We need open weights models that are big and run on H200s.

We have this class of models already; Kimi 2.5 and GLM-5 are proper SOTA models. Nemotron might also release a larger-sized model at some point in the future. With the new NVMe-based offload being worked on as of late, you can even experiment with these models on your own hardware, but of course there are plenty of cheap third-party inference platforms for these too.

lpcvoid 3 hours ago | parent | prev | next [-]

> Open source efforts need to give up on local AI and embrace cloud compute.

Oh god no, please not more slop, you're already consuming over 1 percent of human energy output, could you, like, chill a bit?

nhecker 2 hours ago | parent | next [-]

In a similar vein: seek efficiency.

I.e., /if/ I am going to consume LLM tokens, I figure that a local LLM with tens of billions of parameters running on commodity hardware at home will still consume far more energy per token than a frontier model running on commercial hardware that is very strongly incentivized to be as efficient as possible. Do the math; it isn't even close. (Maybe it'd be closer in your local winter, when the compute heat could offset your heating requirements. But that gets harder to quantify.)

Maybe it's different if you have insane and modern local hardware, but at least in my situation that is not the case.

zozbot234 2 hours ago | parent [-]

But commodity hardware that's right-sized for your own private needs is many orders of magnitude cheaper than datacenter hardware that's intended to serve millions of users simultaneously while consuming gigawatts in power. You're mostly paying for that hardware when you buy LLM tokens, not just for power efficiency. And your own hardware stays available for non-AI related needs, while paying for these tokens would require you to address these needs separately in some way.

nhecker an hour ago | parent | next [-]

>And your own hardware stays available for non-AI related needs, while paying for these tokens would require you to address these needs separately in some way.

^ Fair. Yep, I agree the calculus changes if you don't have _any_ local hardware and you're needing to factor in the cost of acquiring such hardware.

When I did this napkin math, I was mostly interested in the energy aspect, using cost as a proxy. I was calculating the $/token, taking into consideration the cost of a kWh from my utility, the measured power draw of my M1 work machine, and the measured tokens per second processed by a ~20B-parameter open-weight model. I then compared this to the published $/token rate of a frontier provider, and it was something like two orders of magnitude in favor of the frontier model. I get it, they're subsidizing, but I've got to imagine there's some truth in the numbers.
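Here's that calculation spelled out as a template. Every input below is an assumption of mine (not the parent's actual measurements), and the result swings by orders of magnitude with throughput and the provider's rate, so plug in your own numbers:

    # Template: local energy cost per token vs. a published API rate.
    # All inputs are assumptions; substitute your own measurements.

    power_w = 40         # assumed package power while generating, W
    tokens_per_s = 5     # assumed throughput for a ~20B-parameter model
    kwh_price = 0.30     # assumed residential electricity, USD/kWh

    kwh_per_mtok = power_w / 1000 / 3600 / tokens_per_s * 1e6
    local_usd_per_mtok = kwh_per_mtok * kwh_price

    api_usd_per_mtok = 0.60  # assumed provider output rate, USD per 1M tokens

    print(f"Local energy only: ${local_usd_per_mtok:.2f} per 1M tokens")
    print(f"API list price:    ${api_usd_per_mtok:.2f} per 1M tokens")

Note this counts only marginal electricity on the local side; the API price also has to recover hardware, which is part of why the comparison is so sensitive to the assumptions.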

I wonder, does (or will) the $/token rate fall asymptotically toward the cost of electricity? In my mind I'm drawing a parallel to how the value of mined cryptocurrency approximately tracks the cost of electricity... but I might be misremembering that detail.

tonyedgecombe an hour ago | parent | prev [-]

I doubt it, because you aren't going to get the utilisation that a commercial setup would. There's no point wasting tons of money on hardware that sits idle most of the time.

zozbot234 14 minutes ago | parent [-]

If you're running agentic workloads in the background (either some coding agent or personal claw-agent type) that's enough utilization that the hardware won't be sitting idle.

echelon 11 minutes ago | parent | prev [-]

Y'all aren't getting it.

- Our career is reaching the end of the line

- 99.9999% of users will be using the cloud

- If we don't have strong open source models, we're going to be locked into hyperscaler APIs for life

Why are you building for hobby uses?

Build for freedom of the ability to make and scale businesses. To remain competitive. To have options in the future independent of hyperscalers.

We're going to be locked out of the game soon.

Everyone should be panicking about losing the ability to participate.

Play with your RTXes all you like. They might as well be raspberry pis. They're toys.

gessha 3 hours ago | parent | prev [-]

Man, going to personal computing was a mistake, we should’ve stayed jacked to the mainframes /s