Hold on to Your Hardware (xn--gckvb8fzb.com)
280 points by LucidLynx 3 hours ago | 206 comments
BLKNSLVR 7 minutes ago | parent | next [-]

This may not be entirely appropriate to the reasons behind the article, but it feels tangentially related:

I'd like to say a quick thank you for what the brief, golden period of globalisation was able to bring us.

I hope that that level of international trade and economic cooperation across geographical, ideological, political, and religious boundaries can be achieved again at some point in the future, but it seems the pendulum is swinging the other way for the time being.

I hope that, wherever the current direction ends up, there are lessons that can be learnt about what we had, and somehow fumbled, such that there is motivation enough to get back there.

stronglikedan 2 minutes ago | parent [-]

> I hope that that level of international trade and economic cooperation across geographical, ideological, political, and religious boundaries can be achieved again at some point in the future

Me too, but without all the slavery this time please. It'll never work if some actors are willing to abuse their workforces to keep prices low as they do.

saadn92 11 minutes ago | parent | prev | next [-]

The article's dystopia section is dramatic but the practical point is real. I've been self-hosting more and more over the past year specifically because I got uncomfortable with how much of my stack depended on someone else's servers.

Running a VPS with Tailscale for private access, SQLite instead of managed databases, flat files synced with git instead of cloud storage. None of this requires expensive hardware; it just requires caring enough to set it up.
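
The SQLite-instead-of-managed-database piece of a setup like this can be sketched with nothing but Python's stdlib (the filenames and schema here are illustrative, not the commenter's actual setup):

```python
import sqlite3
from pathlib import Path

# SQLite instead of a managed database: one file, trivially backed up.
db = sqlite3.connect("notes.db")
db.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
db.execute("INSERT INTO notes (body) VALUES (?)", ("hello, self-hosting",))
db.commit()

# Point-in-time copy via the stdlib backup API; safe while the DB is in use.
dst = sqlite3.connect("notes.backup.db")
db.backup(dst)
dst.close()
db.close()

print(Path("notes.backup.db").stat().st_size > 0)  # True: backup is non-empty
```

The backup file is an ordinary file, so it slots straight into the same git/rsync flow as the flat files.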

gregoriol 4 minutes ago | parent [-]

You are missing one important part: maintenance. On a managed service, dozens of hours of maintenance are done by someone else; when you are self-hosting, you'll be doing three times that, because you can't know all the details of making so many tools work, because each tool will have to be upgraded at some point and the upgrade will fail, because you have to test your backups, and many more things in the long run.

So yeah, it's fun. But don't underestimate that time; it could easily be time spent with friends or family.

barrkel 3 hours ago | parent | prev | next [-]

I don't buy the central thesis of the article. We won't be in a supply crunch forever.

However, I do believe that we're at an inflection point where DC hardware is diverging rapidly from consumer compute.

Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.

I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked, based on the logic that hardware had moved on but local compute was basically stagnant, and if I wanted to own my computing hardware, I'd better buy something now that will last a while.

This way, my laptop is just a disposable client for my real workstation, a Tailscale connection away, and I'm free to do whatever I like with it.

I could sell the RAM alone now for the price I paid for it.

Aurornis 2 minutes ago | parent | next [-]

> Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node.

How can you say this when Apple is releasing extremely fast M5 MacBook Pros? Or the $600 MacBook Neo that has incredible performance for that price point?

Even x86 is getting some interesting options. The Strix Halo platform has become so popular with LLM users that the parts are being sold in high numbers for little desktop systems.

raincole 2 hours ago | parent | prev | next [-]

We won't be in a supply crunch forever. We'll have a demand crunch. The demand for powerful consumer hardware will shrink so much that producing it will lose its economies of scale. It was always bound to happen, just delayed by the trend of pursuing realistic graphics for games.

People who are willing to drop $20k on a computer might not be affected much tho.

TeMPOraL 2 hours ago | parent | next [-]

> People who are willing to drop $20k on a computer might not be affected much tho.

They probably won't, but those willing to drop $3-10k will be, if consumer and data-center computing diverge at the architectural level. It's the classic hollowing-out of the middle: most of the offerings end up in a race to the bottom chasing volume from price-sensitive customers, the quality options lose economies of scale and disappear, and the high end becomes increasingly bespoke and pricey, or splits off into a distinct market with an entirely different type of customer (here: DC vs. individuals).

m3nu an hour ago | parent | prev | next [-]

My bet is that phone hardware will be used more and more in mini PCs and laptops, keeping costs down and volume up. We already see it with Apple and with many of the Chinese mini PC makers I've looked at.

bparsons 4 minutes ago | parent | prev | next [-]

The problem is that there is a very large incentive for three large companies to corner the market on computing components, forcing consumers to rent access instead of owning.

molszanski an hour ago | parent | prev | next [-]

> We won't be in a supply crunch forever.

This is what always happens in capitalism. Scarcity is almost always followed by glut.

drecked an hour ago | parent [-]

I don’t believe we are seeing the investments necessary that would indicate this will happen.

Memory makers, for example, have sold out their inventory for several years ahead, but instead of investing to manufacture more, they're shutting down their consumer divisions and just transferring their consumer supply to B2B (read: AI) customers instead.

That's likely because they don't expect this demand to last past a few years.

fmajid 44 minutes ago | parent | next [-]

They have seen boom and bust cycles previously and are understandably wary of expanding capacity for expected demand that may fizzle. If they stay too conservative, China’s CXMT is chomping at the bit to eat their lunch, backed by the Chinese government, but that’s not going to help until late 2027 at best.

bitmasher9 an hour ago | parent | prev [-]

If the demand lasts for a few years, I’m doubtful that all of the consumer capacity will come back.

close04 34 minutes ago | parent | prev [-]

> We'll have a demand crunch

This is what I'm afraid of. As more stuff moves to the cloud, helped in part by current HW prices, the demand for consumer hardware will drop. That keeps the vicious cycle turning: rising consumer HW prices, more moves to the cloud.

I can already see Nvidia rubbing their hands together in expectation of the massive influx of customers to their cloud gaming platform. If a GPU is so expensive, you move to a rental model and the subsequent drop in demand will make GPUs even more expensive. They're far from the only ones with dollar signs in their eyes, between the money and total control over customers this future could bring.

Being entirely reliant on someone else's software and hardware is a bleak thought for a person used to some degree of independence and self sufficiency in the tech world.

HKH2 15 minutes ago | parent [-]

> I can already see Nvidia rubbing their hands together in expectation of the massive influx of customers to their cloud gaming platform.

Roblox is not popular because of its graphics. Younger gamers care more about having fun than having an immersive experience.

hombre_fatal 8 minutes ago | parent | next [-]

Yeah, this gamer conspiracy theory never made sense to me.

Also, if gamers demand infinitely improving graphics so much that they would rather pay for cloud gaming than relax their expectations and be happy with, say, current gen graphics, then that is more a claim about modern self-pwned gamer behavior than megacorp conspiracy.

But I don't buy that either. The biggest games on Steam Charts and Twitch aren't AAA RTX 5090 games.

foobarian 11 minutes ago | parent | prev [-]

I love it when I get my Robloxhead daughter to test drive some of the games I play on my 5090 box. "Ooooh these graphics are unreal" "Can we stop for just a moment and admire this grass" :-D

kace91 2 hours ago | parent | prev | next [-]

The thing is, other than AI stuff, where does a non powerful computer limit you?

My phone has 16gigs of ram and a terabyte of storage, laptops today are ridiculous compared to anything I studied with.

I'm not arguing, mind you, just trying to understand the use cases people are thinking of here.

zozbot234 2 hours ago | parent | next [-]

> other than AI stuff, where does a non powerful computer limit you?

Running Electron apps and browsing React-based websites, of course.

tormeh 2 hours ago | parent | next [-]

For real. Once I've opened Spotify, Slack, Teams, and a browser about 10GB of RAM is in use. I barely have any RAM left over for actual work.

foobarian a few seconds ago | parent | next [-]

I keep wondering why we can't have 2000s software on today's hardware. Maybe because browsers are de facto required to build apps?

skydhash 2 hours ago | parent | prev [-]

That’s why I only run those on work computers (where they are mandated by the company). My personal computers are free of this software.

lpcvoid an hour ago | parent [-]

I rarely dodge a chance to shit on Microslop and its horrible products, but you don't use a browser? In fact, running all that junk in a single Chromium instance is quite a memory saver compared to individual Electron applications.

bluGill 28 minutes ago | parent | next [-]

I use a browser at home, but I don't use the heaviest web sites. There are several options for my hourly weather update, some worse than others (sadly I haven't found any that are lightweight; I just need to know whether there will be a thunderstorm when I ride my bike home from work, which would mean I shouldn't ride in now).

TeMPOraL a few seconds ago | parent | next [-]

I'm giving up on weather-app bullshit at this point, and am currently making myself a Tasker script to feed hourly weather predictions into a calendar (synced from the phone up to the cloud), so I can see them displayed inline with calendar events on my calendar and, most importantly, my watch[0] - i.e. in the context where it actually matters.

--

[0] - Having https://sectograph.com/ as a watch face is 80%+ of value of having a modern smartwatch to me. Otherwise, I wouldn't bother. I really miss Pebble.
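
For anyone tempted by the same hack without Tasker: the calendar half is just iCalendar generation. A minimal Python sketch, where the event fields and the forecast-row shape are assumptions (roughly Open-Meteo's hourly temperature/precipitation fields), not TeMPOraL's actual script:

```python
from datetime import datetime, timedelta

def forecast_to_ics(hours):
    """Turn (iso_hour, temp_c, precip_probability) rows into a minimal
    iCalendar feed, one short event per forecast hour."""
    lines = ["BEGIN:VCALENDAR", "VERSION:2.0", "PRODID:-//weather-to-calendar//EN"]
    stamp = "%Y%m%dT%H%M%S"
    for iso_hour, temp_c, precip_pct in hours:
        start = datetime.fromisoformat(iso_hour)
        end = start + timedelta(minutes=15)  # short event so it reads as a label
        lines += [
            "BEGIN:VEVENT",
            f"UID:wx-{start.strftime(stamp)}@example.invalid",
            f"DTSTART:{start.strftime(stamp)}",
            f"DTEND:{end.strftime(stamp)}",
            f"SUMMARY:{temp_c:.0f}\u00b0C, {precip_pct}% rain",
            "END:VEVENT",
        ]
    lines.append("END:VCALENDAR")
    return "\r\n".join(lines)

# Rows shaped like an hourly forecast response.
sample = [("2025-01-10T07:00", 2.5, 80), ("2025-01-10T08:00", 3.1, 20)]
print(forecast_to_ics(sample).count("BEGIN:VEVENT"))  # 2 - one event per hour
```

Serving the resulting text with a `text/calendar` MIME type makes it subscribable from most calendar apps, which then handle the phone/cloud/watch sync for free.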

lpcvoid 23 minutes ago | parent | prev [-]

Try Quickweather (with OpenMeteo) if you're on Android. I love it.

skydhash an hour ago | parent | prev [-]

Why would I need a browser to play music? Or to send an email? Or to type code? My browser usage is mostly for accessing stuff on someone else’s computer.

lpcvoid an hour ago | parent [-]

The only subscription I have is Spotify, since there's no easy way that I know of to get the music discoverability that Spotify offers.

For the rest: I agree with you.

teeray an hour ago | parent | prev | next [-]

Companies love externalizing the costs of making efficient software onto consumers, who need to purchase more powerful computing hardware.

vladvasiliu an hour ago | parent [-]

If only. At work I've got a new computer, replacing a lower-end 5-year-old model. The new one has four times the cores, twice the RAM, a non-circus-grade SSD, and a high-powered CPU as opposed to the "U"-series chip the old one had.

I haven't noticed any kind of difference when using Teams. That piece of crap is just as slow and broken as it always was.

sfn42 35 minutes ago | parent [-]

Yeah, people love to shit on Electron and such, but they're full of crap. It doesn't matter one bit for anything more powerful than a Raspberry Pi - probably not even there. "Oh boo hoo, Chrome uses 2 gigs of RAM" - so what, you have 16+; it doesn't matter. I swear people have some weird idea that the ideal world is one where 98% of their RAM just sits unused. The whole point of RAM is to use it, but whenever an application does use it, people whine. And it's not even "this makes my PC slow", it's literally just "hurr durr RAM usage is X". Okay, but is there an actual problem? Crickets.

interf4ce 9 minutes ago | parent | next [-]

The issue isn't usage, it's waste. Every byte of RAM that's used unnecessarily because of bloated software frameworks used by lazy devs (devs who make the same arguments you're making) is a byte that can't be used by the software that actually needs it, like video editing, data processing, 3D work, CAD, etc. It's incredibly short sighted to think that any consumer application runs in a vacuum with all system resources available to it. This mindset of "but consumers have so much RAM these days" just leads to worse and worse software design instead of programmers actually learning how to do things well. That's not a good direction and it saddens me that making software that minimizes its system footprint has become a niche instead of the mainstream.

tl;dr: no one is looking for their RAM to stay idle. They're looking for their RAM to be available.

gryfft 32 minutes ago | parent | prev | next [-]

"chrome uses 2gb of ram"

These days individual _tabs_ are using multiple GB of RAM.

vladvasiliu 22 minutes ago | parent | prev [-]

I think it's a correlation-vs-causation type thing. Many Electron apps are extremely, painfully slow. Teams is pretty much the poster child for this, but even Spotify sometimes finds a way to lag when it's just a freaking list of text.

Are they slow because they're Electron? No idea. But you can't deny that most Electron apps are sluggish for no clear reason. At least if they were pegging a CPU, you'd figure your box is slow. But that's not even what happens. Maybe they would've been sluggish even using native frameworks. Teams seems to do 1M network round-trips on each action, so even if it was perfectly optimized assembly for my specific CPU it would probably make no difference.

Esophagus4 2 hours ago | parent | prev | next [-]

It seems like as hardware gets cheaper, software gets more bloated to compensate. Or maybe it’s vice versa.

I wonder if there’s a computer science law about this. This could be my chance!

mjmas an hour ago | parent | next [-]

Is your name Wirth?

daveguy an hour ago | parent | prev [-]

Sorry to burst your bubble:

https://en.wikipedia.org/wiki/Wirth%27s_law

Not exactly the same (it's about power rather than price), but close enough that when you said it, I thought, "oh! there is something like that." There are also more fundamental economic laws at play: supply and demand for a resource, efficiencies at scale, etc. Given our ever-increasing demand for compute compared to the increasing supply (cheaper, more powerful compute), I expect supply will bottleneck before demand does.

mettamage 2 hours ago | parent | prev [-]

This is the way

mbfg 12 minutes ago | parent | prev | next [-]

I've never had a personal computer that came even close to powerful enough to do what I want. Compiles that take 15 minutes are really annoying, for instance.

WillAdams an hour ago | parent | prev | next [-]

3D CAD/CAM is still CPU-bound (and to a lesser extent memory-bound). I do joinery, and my last attempt at a test joint for a project I'm still working up to was a 1" x 2" x 1" area (two 1" x 1" x 1" halves which mated); it took an entry-level CAM program some 18-20 minutes to calculate and produced a ~140MB file including G-code toolpaths... (I really should have tracked memory usage.)

swiftcoder an hour ago | parent | prev | next [-]

The big one for me is ballooning dependency trees in popular npm/cargo frameworks. I had to trade a perfectly good i9-based MacBook Pro up to an M2, just to get compile times under control at work.

The constant increases in website and electron app weight don't feel great either.

duskdozer an hour ago | parent | prev [-]

>My phone has 16gigs of ram and a terabyte of storage

That's "non powerful" to you?

kace91 13 minutes ago | parent [-]

The opposite. I meant that if this is what consumer grade looks like nowadays, even with a fraction of current flagships we seem well covered - this was less than 800 bucks.

porkeynon 4 minutes ago | parent | prev | next [-]

$20k?

People laugh at young men for looksmaxxing. And then there’s this. I dunno. As someone who has been playing computer games since the 70s, I clearly do not understand the culture anymore. But what forces would drive a young man to spend the price of a used car to play a derivative FPS? It seems heartbreaking. Just like the looksmaxxer.

barrkel 2 minutes ago | parent [-]

Alas, I'm not a young man any more. And my HEDT is headless, it has no monitor with which to play FPSes.

guessmyname 2 hours ago | parent | prev | next [-]

> I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked […]

768GB of RAM is insane…

Meanwhile, I’ve been going back and forth for over a year about spending $10k on a MacBook Pro with 128GB. I can’t shake the feeling I’d never actually use that much, and that, long term, cloud compute is going to matter more than sinking money into a single, non-upgradable machine anyway.

rafaelmn 2 hours ago | parent | next [-]

Your battery is going to suffer because of the extra ram as well.

I don't know your workloads, but for me personally 64 GB is the ceiling on RAM - I can run an entire k8s cluster locally with that, and the M5 Pro with top cores is the same CPU as the M5 Max. I don't need the GPU - the local AI story and OSS models are just a toy for my use cases, and I'm always going to shell out for the API/frontier capabilities. I'm even thinking of the 48 GB config, because those already have 8% discounts/shipped by Amazon, and I never hit that even on my workstation with 64 GB.

zozbot234 2 hours ago | parent [-]

> Your battery is going to suffer because of the extra ram as well.

No, it won't. The power drain of merely refreshing DRAM is negligible, it's no higher than the drain you'd see in S3 standby over the same time period.

cduzz 7 minutes ago | parent | next [-]

I suspect this is one of those "it depends" situations: does the 128GB SKU have more chips or denser chips than the 64GB one? If more chips, it'll probably draw a tiny bit more power than the smaller version. If denser chips, it may draw more power, but such a tiny difference that it's immaterial.

Similarly, having more cache may mean less SSD activity, which may mean less energy draw overall.

If I had a chip to put on the roulette table of this "what if" I'd put it on the "it won't make a difference in the real world in any meaningful way" square.

3form an hour ago | parent | prev [-]

Given the DRAM refresh is part of S3 standby, I'm afraid this is circular reasoning.

barrkel 2 hours ago | parent | prev | next [-]

Look at the way age gating is going in a global coordinated push. Can control of compute be far behind?

It wasn't my primary motivator but it hasn't made me regret my decision.

I hummed and hawed on it for a good few months myself.

WillAdams an hour ago | parent | next [-]

Just look at ITAR and the various attempts at legislating 3D printing and CNC machining of firearms parts to see one justification point of that.

jusssi 2 hours ago | parent | prev [-]

> Can control of compute be far behind?

How is this going to work? You need uncontrolled compute for developing software. Any country locking up that ability too much will lose to those who don't.

cesarb 23 minutes ago | parent [-]

> How is this going to work? You need uncontrolled compute for developing software.

I've read about companies where all software developers have to RDP to the company's servers to develop software, either to save on costs (sharing a few powerful servers with plenty of RAM and CPU between several developers) or to protect against leaks (since the code and assets never leave the company's Citrix servers).

gzread 2 hours ago | parent | prev | next [-]

> 768GB of RAM is insane.

Before this price spike, it used to be you could get a second-hand rack server with 1TB of DDR4 for about $1000-2000. People were massively underestimating the performance of reasonably priced server hardware.

You can still get that, of course, but it costs a lot more. The recycling company I know is now taking the RAM out of every server and selling it separately.

Apple hardware is incredibly overpriced.

FpUser 2 hours ago | parent | prev [-]

My home server has 512GB of RAM and 48 cores; my 4 desktops are 16 cores, 128GB, and a 4060 GPU each. The server is second-hand and I paid around $2500 for it; the desktops were just below $3000 each when I built them. All prices are in Canadian Pesos.

xyzsparetimexyz 2 hours ago | parent [-]

Canadian Pesos?

doubled112 2 hours ago | parent | next [-]

Jokes because the Canadian dollar’s value isn’t very high right now.

See a $1100 GPU on eBay, but it’s in the US? Actually a $1900 GPU.

A colleague and I were just talking about how well he timed the purchase of his $700 24GB 3090.

heffer an hour ago | parent | prev | next [-]

Please, it's actually Cambodian Dollhairs or Canuckistan Pesos.

FpUser 2 hours ago | parent | prev [-]

It is sarcasm. Our dollar, which used to be at par with the US dollar, no longer is.

motbus3 an hour ago | parent | prev | next [-]

Superficially speaking, you could be right. But I think it has been realised that causing scarcity of products and commodities is a power move.

We live in a world optimised for globalisation: industry in China, oil in the Middle East, etc.

This approach proved fragile in the hands of people with enough money and/or power to tilt the scale.

picture 2 hours ago | parent | prev | next [-]

It seems like you largely agree with the article - people shall own nothing and be happy. Perhaps the artificially induced supply crunch could go on indefinitely.

Also, I wonder how many of us, even here on HN, have the ability to spend that amount of money on a computer for personal use. Frankly, I wouldn't even know what to do with all that RAM - should I just ramdisk every program I use and every digital thing I've made in the last five years?

Anyhow, I suppose for the folks who can't afford hardware (perhaps by design), one ought to own nothing and be happy.

9wzYQbTYsAIc an hour ago | parent | next [-]

Solarpunk + https://permacomputing.net/

That’s for everyone

barrkel 2 hours ago | parent | prev [-]

People spend a lot more than that on a car they use less, especially if they're in tech.

The RAM choice was because I have never regretted buying more RAM - it's practically always a better trade than a slightly faster CPU - and 96GB DIMMs were at a sweet spot compared to 128GB DIMMs.

That, and the ability to have big LLMs in memory, for some local inference, even if it's slow mixed CPU/GPU inference, or paged on demand. And if not for big LLMs, then to keep models cached for quick swapping.

sfn42 17 minutes ago | parent [-]

I bought a 4-year-old car for significantly less than that. And I can get a computer that does 99% of what your monster can do for like 10% of the price. And if I want LLM inference, I can get that for like $20 a month or whatever.

I don't mean to judge, it's your money but to me it seems like an enormous waste. Just like spending $100k on a car when you can get one for $15k that does pretty much exactly the same job.

kgeist 2 hours ago | parent | prev | next [-]

>we're at an inflection point where DC hardware is diverging rapidly from consumer compute.

I thought the trend was in the opposite direction, with the RTX 5x series converging with server architectures (Blackwell-based, such as the RTX 6000 Pro+), just with less VRAM and fewer tensor cores, artificially.

Where is the divergence happening? Or do you not view the RTX 5x as consumer hardware?

girvo an hour ago | parent [-]

Blackwell diverges within Blackwell itself… SM121 on the GB10 vs the RTX 5000 consumer vs the actual full fat B100 hardware all have surprisingly different abilities. The GB10 has been hamstrung by this a bit, too.

shevy-java 2 hours ago | parent | prev | next [-]

> We won't be in a supply crunch forever.

I don't share the article's opinion 1:1, but it is absolutely clear that RAM prices have gone up enormously. Just compare them; that is fact.

It may be cheaper later on, but... when will that happen? Is there a guarantee? A supply crunch can also mean that fewer people can afford something because prices are now much higher than before. Add to this the oil crisis Trump started, and we are suddenly paying more just because a few mafiosi benefit from it. (See Krugman's analysis of the recent stock market flow of money/stocks.)

bluGill 25 minutes ago | parent [-]

General predictions are that in 3-5 years things will return to normal: 3 years if the current AI crunch is a short-term thing, 5 years if it isn't and we have to build new RAM factories.

echelon 2 hours ago | parent | prev [-]

Local is a dead end.

Open source efforts need to give up on local AI and embrace cloud compute.

We need to stop building toy models to run on RTX cards and instead try to compete with the hyperscalers. We need open-weights models that are big and run on H200s. Those are the class of models that will be able to compete.

When the hyperscalers reach takeoff, we're done for. If we can stay within ~6 months, we might be able to slow them down or even break them.

If there was something 80-90% as good as Opus or Seedance or Nano Banana, more of the ecosystem would switch to open source because it offers control and sovereignty. But we don't have that right now.

If we had really competitive open weights models, universities, research teams, other labs, and other companies would be able to collaboratively contribute to the effort.

Everyone in the open source world is trying to shrink these models to fit on their 3090 instead, though, and that's such a wasted effort. It's short term thinking.

An "OpenRunPod/OpenOpenRouter" + one click deploy of models just as good as Gemini will win over LMStudio and ComfyUI trying to hack a solution on your own Nvidia gaming card.

That's such a tiny segment of the market, and the tools are all horrible to use anyway. It's like we learned nothing from "The Year of Linux on Desktop 1999". Only when we realized the data center was our friend did we frame our open source effort appropriately.

zozbot234 2 hours ago | parent | next [-]

> We need open weights models that are big and run on H200s.

We already have this class of models: Kimi 2.5 and GLM-5 are proper SOTA models. Nemotron might also release a larger-sized model at some point in the future. With the new NVMe-based offload being worked on of late, you can even experiment with these models on your own hardware, but of course there are plenty of cheap third-party inference platforms for these too.

lpcvoid an hour ago | parent | prev | next [-]

> Open source efforts need to give up on local AI and embrace cloud compute.

Oh god no, please not more slop. You're already consuming over 1 percent of human energy output; could you, like, chill a bit?

nhecker an hour ago | parent [-]

In a similar vein: seek efficiency.

I.e., /if/ I am going to consume LLM tokens, I figure that a local LLM with tens of billions of parameters running on commodity hardware at home will still consume far more energy per token than a frontier model running on commercial hardware that is very strongly incentivized to be as efficient as possible. Do the math; it isn't even close. (Maybe it'd be closer in your local winter, where your compute heat could offset your heating requirements, but that gets harder to quantify.)

Maybe it's different if you have insane and modern local hardware, but at least in my situation that is not the case.
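
The back-of-envelope math looks something like this, with loudly assumed numbers (the power draws and throughputs below are rough illustrations for one plausible scenario, not measurements):

```python
# All figures are assumptions for illustration: a home box running a
# mid-size model single-stream vs. a datacenter accelerator batching
# requests from many users at once.
local_watts = 350            # desktop + GPU under inference load (assumed)
local_tokens_per_s = 15      # single-stream decode speed (assumed)

dc_watts = 1000              # accelerator + its share of overhead (assumed)
dc_tokens_per_s = 2000       # aggregate batched throughput (assumed)

local_j_per_token = local_watts / local_tokens_per_s   # ~23 J/token
dc_j_per_token = dc_watts / dc_tokens_per_s            # ~0.5 J/token

print(f"local: {local_j_per_token:.1f} J/token, "
      f"datacenter: {dc_j_per_token:.1f} J/token, "
      f"ratio: {local_j_per_token / dc_j_per_token:.0f}x")
```

The gap is dominated by batching: the datacenter amortizes one forward pass's power over many users' tokens, while the home box pays full power for a single stream.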

zozbot234 5 minutes ago | parent [-]

But commodity hardware that's right-sized for your own private needs is many orders of magnitude cheaper than datacenter hardware that's intended to serve millions of users simultaneously while consuming gigawatts in power. You're mostly paying for that hardware when you buy LLM tokens, not just for power efficiency. And your own hardware stays available for non-AI related needs, while paying for these tokens would require you to address these needs separately in some way.

gessha an hour ago | parent | prev [-]

Man, going to personal computing was a mistake, we should’ve stayed jacked to the mainframes /s

bluejay2387 2 hours ago | parent | prev | next [-]

The general take here seems to be "everything eventually passes". That isn't always true. I wonder how many people now have a primary computing device that they don't even have full control over (Apple phones, tablets...). Years ago, the concept of spending over $1k on a computer you didn't even have the right to install your own software on was considered ridiculous by many people (myself included). Now many people primarily consume content on a device controlled almost entirely by the company they bought it from. If the economics lead to a situation where it's more profitable to sell you compute time than to sell you computers, then businesses will choose not to sell you computers. I have no idea if that is what ends up happening.

threetonesun 8 minutes ago | parent | next [-]

The framing here is wrong, I think. My iPad has a lot of software on it that I use for music production, and it all runs locally. Yes, I had to install it through Apple's app store, but I could disconnect it from the Internet and expect it, at this point, to keep working as long as the software on almost any piece of hardware it replaces.

Meanwhile my much more expensive laptop mostly interfaces with applications that primarily exist on servers that I have no control over, and it would be nearly worthless if I disconnected it from the Internet. Your central point is right, the economics are concerning, but I think it's been a ship slowly sailing away that we're now noticing has disappeared over the horizon.

cmiles74 an hour ago | parent | prev | next [-]

It's worth keeping an eye on this HP-rental-laptop thing.

Personally I think it will be a big headache for HP; people can be hard on laptops, and HP is already not excited about consumer support (e.g. a mandatory 15-minute wait for support calls). But if they make it work, I think there are probably a good number of people who feel like they need a laptop but don't care much about the specifics and want to keep their costs low (as all of their costs appear to be rising right now).

bluGill 21 minutes ago | parent [-]

Rental seems to be about corporate laptops. Companies just want things to work at a predictable cost. They are already replacing laptops after 5 years even if they work. They are already replacing a few laptops that break in less than that 5 years. In short they are already renting the laptops, they are just paying the price upfront and then using accounting to balance it out. Rental just moves the accounting, but otherwise nothing changes.

For consumers who don't replace their laptops on a schedule it makes less sense.

ozgrakkurt 2 hours ago | parent | prev | next [-]

To be fair, the people who have an iPad as their only computing device now didn't have a computer back then.

TeMPOraL an hour ago | parent [-]

Not necessarily. Many people grew up with PCs and laptops but now mostly use their phones, because outside of specific jobs or hobbies, everyday computing needs are heavily optimized for mobile-first.

(A large factor here is, obviously, the cloud. With photos, documents, e-mail, IMs, etc. all hosted for cheap or free on "other people's computers", the total hardware demands on the end-user computing device is much less. Think storage, not just RAM.)

It's true even in tech; half a year ago I switched my phone to a Galaxy Z Fold7, and I haven't used my personal laptop since then, not once. I have a separate company laptop for work, and I occasionally turned on my PC, but it turns out that a foldable phone is good enough to do everything on personal side I'd normally use a laptop for. So here I am, with my primary compute device I don't have full control over - and yes, I'm surprised by this development myself, and haven't fully processed it yet.

bluGill 9 minutes ago | parent | next [-]

In a lot of ways the cloud is better than my personal computer, even if I'm on it.

There is a reason I have a server in my basement - it lets me edit files on my phone (if I must - the keyboard and screen space are terrible compromises, but sometimes I can live with them), laptop (acceptable keyboard and screen), or desktop (great keyboard, large screen); it also lets me share with my wife (I haven't got this working yet, but it can be done). I have nearly always had a server in my house because sharing files between computers is so much better than only being able to work on one (or using floppies). The cloud expands my home server to anywhere in the world: it offloads security onto someone else, and makes it someone else's problem to keep the software updated.

There is a lot to hate about the cloud. My home servers also have annoyances. However, for most things it is conceptually better, and we just need the cloud providers to fix the annoyances (it is an open question whether they will).

audunw 20 minutes ago | parent | prev | next [-]

> Not necessarily. Many people grew up with PCs and laptops but now mostly use their phones, because outside of specific jobs or hobbies, everyday computing needs are heavily optimized for mobile-first.

It's a deeply flawed comparison, because many of the things we do with a phone now weren't things we'd do at all with the computers we grew up with. We didn't pay at the grocery store with a computer, we didn't buy metro tickets, we didn't use it to navigate (well, there was a short period of time where we might print out maps, but anyway..)

When I grew up, I feel like our use of home computers fell into two categories:

1. Some of us kids used them to play games. Though many more would have a Nintendo/Sega for that, and I feel like the iPhone/iPad is a continuation of that. The "it just works" experience where you have limited control over the device.

2. Some parents would use it for work/spreadsheets/documents ... and that's still where most people use a "real" computer today. So nothing has really changed there.

There is now a lot more work where you do the work on services running on a server or in the cloud. But that's back to the original point: that's in many cases just not something we could do with old home computers. Like, my doctor can now approve my request for a prescription from anywhere in the world. That just wasn't possible before, and arguably isn't possible without a server/cloud-based infrastructure.

Phones/tablets as an interface to these services are arguably a continuation of those old dumb terminals connected to e.g. AS/400 machines and such.

> It's true even in tech; half a year ago I switched my phone to a Galaxy Z Fold7, and I haven't used my personal laptop since then, not once.

I do agree, I am in a similar situation.

nhecker an hour ago | parent | prev [-]

Ditto. My personal equipment includes a home server (128GB DDR3 ECC) and a tablet with a keyboard. It's honestly astonishing what you can do without a full-fledged laptop, if you're willing to go through some gymnastics to get there. And it travels light compared to a laptop! (The tablet, that is. Not the headless box. :-))

doom2 an hour ago | parent | prev | next [-]

I'm also very skeptical of "everything eventually passes" as it pertains to hardware prices. Right now, prices are high because supply can't keep up with demand. But if/when supply increases to meet demand or demand decreases, there's no reason for companies to drop prices now that consumers have become accustomed to them.

hombre_fatal an hour ago | parent | next [-]

> there's no reason for companies to drop prices

Competition.

bitmasher9 an hour ago | parent | prev [-]

My primary concern is for next generation hardware.

Will we continue to see steady improvement in top-quality CPUs/GPUs? Would they even bother releasing consumer versions of RAM faster than DDR5?

surgical_fire 13 minutes ago | parent | prev [-]

Being in control of your own computing device was always a niche. The vast majority of people are not interested in computing itself, only in the output. For that majority, this is fine.

The niche is still there, probably as big as it was before. For example, as I grew weary of being subject to services I have little control over, I set up my own home server using a refurbished PC. It has been an amazing journey so far. But I don't think a normie would ever get interested in buying a refurbished Dell, installing Debian on it, and setting up their own services there.

As long as there is a niche of people interested in buying their own computers, there will be companies willing to fill that niche.

rswail 2 hours ago | parent | prev | next [-]

A long article begging the question, when the last paragraph or two counter the panic of the beginning: two Chinese firms are ramping up production of consumer RAM/SSDs because they see a market opening as the existing producers move to selling to enterprise/hyperscalers.

There have been memory chip panics before; the US funded RAM production back in the '80s/'90s in competition with Japan at the time.

The current AI/"hyperscale" boom is almost exactly like the dotcom boom.

It's already starting to shake down. Anthropic is occupying the developer space, OpenAI has just exited the video/media production space. More focused and vertical market AI is emerging.

The current vortex of money between OpenAI <-> Microsoft <-> Oracle <-> Nvidia <-> Google <-> etc. is going to break.

zozbot234 an hour ago | parent [-]

> Two Chinese firms are ramping up production of consumer RAM/SSDs because they see a market opening

Yes but these Chinese firms are a tiny share of the overall RAM/SSD market, and they'll have the same problems with expanding production as everyone else. So it doesn't actually help all that much.

bluGill 6 minutes ago | parent | next [-]

The biggest problem in expanding for everyone else is they don't trust the market to exist for long enough to be worth paying for a new factory so they are not investing in it. The Chinese might be small, but they think the market will exist and are investing. Will they be right or wrong - I don't know.

bitmasher9 an hour ago | parent | prev [-]

Chinese firms won’t have the exact same problems as anyone else. Some problems will be the same but not all.

* Chinese firms finance through different banks and investors than current ram producers

* A company with a mission statement of consumer ram won’t have their supply outbid by data centers

* Chinese manufacturing has more expertise in scaling than any other manufacturing culture

upofadown 2 hours ago | parent | prev | next [-]

This article inspired me to look and see what this computer is. Apparently it is a "AMD Athlon(tm) II X2 250 Processor" from 2009. So 17 years old. It has 8 GB of DDR3 memory and runs at 3 GHz. It currently has OpenBSD on it, but at least one source thinks it could run Windows 10.

The fact that I didn't know any of this is what is significant here. At some point I stopped caring about this sort of thing. It really doesn't matter any more. Don't get me wrong, I am as nerdy as they come. My first computer was a wire wrapped 8080 based system. That was followed by an also wire wrapped 8086 based system of my own design I used for day to day computing tasks (it ran Forth). If someone like me can get to the point of not caring there is no real reason for anyone else to care.

1970-01-01 an hour ago | parent | next [-]

Your electricity bill alone could justify the cost of a new computer purchase if you're not shutting it down after every session.

einr an hour ago | parent [-]

65W TDP? Let's say we want to run a PC so we're switching to a newer low-end Ryzen with a 35W TDP and that that's a 30W difference for the whole system. Let's say we're running the system 24/7 and the CPU is pulling its full TDP constantly. Average US residential electricity price is $0.18/kWh.

0.03 kW * 24 h * 365 d * $0.18 = $47.30/year
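The same back-of-the-envelope estimate as a quick sketch (the 30 W delta, 24/7 uptime, and the $0.18/kWh rate are the assumptions stated above, not measured figures):

```python
# Yearly cost of the extra draw from an older desktop, assuming a
# constant 30 W difference vs. a newer system, 24/7 uptime, and the
# average US residential rate of $0.18/kWh (worst-case assumptions).
extra_kw = 30 / 1000            # assumed constant extra draw, in kW
hours_per_year = 24 * 365       # running around the clock
rate_usd_per_kwh = 0.18         # assumed average US residential price

kwh_per_year = extra_kw * hours_per_year        # 262.8 kWh
cost_per_year = kwh_per_year * rate_usd_per_kwh
print(f"${cost_per_year:.2f}/year")             # prints "$47.30/year"
```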

cjs_ac an hour ago | parent | next [-]

In the UK, residential electricity tariffs are currently capped by the regulator at 27.69p per kWh, resulting in a total yearly cost of £72.77. Much higher than in the US, but still much cheaper than a new PC.

1970-01-01 44 minutes ago | parent [-]

£72.77 is more than enough for a PC: https://www.ebay.co.uk/itm/377057425659

1970-01-01 an hour ago | parent | prev [-]

So $50/yr for 4 years gives you ~$150 with $50 extra for shipping or whatever, which gets you a decent Lenovo M700 Tiny that's much better in both performance and power consumption.

einr 32 minutes ago | parent [-]

I guess. It's hardly an open-and-shut case of "throw your old computer away!" though, especially when this is a worst-case scenario of running a desktop computer at full blast 24/7 without it ever going into sleep mode or being turned off, and when you don't know what the user's needs are. Maybe a mini-PC with basically no expansion just won't really work for them?

SeanAnderson an hour ago | parent | prev [-]

Someone's never tried to locally compile a Rust program. :)

meindnoch 2 hours ago | parent | prev | next [-]

I know this may sound ridiculous, but m-maybe... maybe it's time for us to make software... less bloated?

Maybe... just maybe, a TODO list app shouldn't run 4 processes, and consume hundreds of megabytes of RAM?

guardian5x 2 hours ago | parent | next [-]

Let me be the devil's advocate here. OK, let's say you optimize that TODO list app to only use 16 MB of RAM. What did you gain by that? Would you buy a smartphone that has less RAM now?

cheschire an hour ago | parent | next [-]

It’s the upgrade treadmill you would stop using, and stick to the initial entry device.

TeMPOraL an hour ago | parent [-]

If only there wasn't a security update treadmill forcing everyone to do regular hardware upgrades.

3form an hour ago | parent [-]

Of course, as long as we're in the dreamland, most of these security upgrades do not actually require a hardware upgrade.

layer8 an hour ago | parent | prev | next [-]

It would be nice for browser tabs and apps to reload less often.

TeMPOraL an hour ago | parent | prev [-]

We can't ever escape market forces? You're right, of course: if software gets less bloated, vendors will "value-optimize" hardware, so in the end computers keep being barely usable, as they are today.

robinsonb5 an hour ago | parent [-]

This year's average phone is already going to have less RAM than last year's average phone - so anything that reduces the footprint of the apps (and even more importantly, websites) we're using can only be a good thing. Plus it extends the usable life of current hardware.

TeMPOraL 2 hours ago | parent | prev | next [-]

That's crazy talk. What will you ask for next? Add functionality to make apps at least as good/capable as they were in the 1990s and early 2000s? And then? Apps that interoperate? Insane.

More seriously and more ironically, at the same time we've now reached a strange point where even non-programmers can vibe-code better software than they can buy/subscribe to - not because models are that good, or because programming isn't hard, but because of the enshittification that has this industry rotten to the core and unable to deliver useful tools anymore.

rvz 2 hours ago | parent | prev [-]

Tell that to those who are still using Electron and TypeScript to create bloated desktop apps.

dust42 2 hours ago | parent | prev | next [-]

Just to mention one thing: helium, which is a necessity for chip production, is a byproduct of LNG production. And 20% of that is just gone (Qatar), and the question is how long it will take to get it back. So not only a chip shortage because AI is buying chips in huge volumes, but also because production will be hampered.

Tongue in cheek: we urgently need fusion power plants. For the AI and the helium.

adrianN 2 hours ago | parent | next [-]

Fusion fuel is so energy dense that fusion plants will never produce industrially meaningful amounts of helium.

TeMPOraL 2 hours ago | parent | next [-]

Well, as long as they can make electricity too cheap to meter, we can get helium from somewhere. Mine it from LNG sources currently untapped due to EROI < 1, or ship it from the goddamn Moon - ultimately, every problem in life (except that of the human heart) can be solved with cheap energy.

kibwen 35 minutes ago | parent | next [-]

The mere existence of proof-of-work cryptocurrencies means that it is impossible to ever have electricity that is "too cheap to meter". Any time electricity prices would fall below the price of mining, that creates a market opportunity that will be filled by more mining. Wasted electricity is the product.

bitmasher9 an hour ago | parent | prev | next [-]

With the trend of orbital launches becoming cheaper, it might be that mining helium off-Terra will be our long-term supply. Especially if the alternative is adjusting the number of protons in an atom.

There are several challenges, not least of which is storage. We have considerable leakage in most of our current helium storage solutions on Earth because it's so light. Our national reserves are literally in underground caverns because they're better than anything we can build. Space just means any containment system will need to work in a wider range of pressures/temperatures.

adrianN an hour ago | parent | prev [-]

There is to my knowledge no reason to assume that complicated physics experiments that heat water to run a steam engine will be much cheaper than fission power plants, unfortunately.

gobdovan 2 hours ago | parent | prev | next [-]

I think this is why he labelled the comment 'tongue in cheek'. Thanks for pointing it out explicitly though; I was not aware of this.

numpad0 an hour ago | parent | prev [-]

Can't they irradiate tanks of H2 or something with so many neutrons and electrons until morale improves and they become He? Or would that make radioactive He?

Arn_Thor 2 hours ago | parent | prev [-]

Considering my helium-filled hard drives a strategic reserve now

TeMPOraL 2 hours ago | parent | next [-]

Gonna sit on my half-empty tank for party balloons from my daughter's birthday, maybe we'll be able to sell it to pay off mortgage quicker than the helium itself escapes the tank.

halapro an hour ago | parent [-]

Same energy as "buy bitcoin" in 2011

TeMPOraL an hour ago | parent [-]

Unfortunately bitcoins don't leak from storage tanks on their own.

myself248 an hour ago | parent | prev [-]

That's another lifetime-limited thing -- the helium leaks out, and you cannot (for practical purposes) stop it or even meaningfully slow it down. When it's gone, the drives are dead. And the helium leaks by calendar-days, it doesn't matter whether the drive is powered on or off.

Non-helium hard drives are basically limited by their bearing spin hours. If one only spins a few hours a week, it'll probably run for decades. Not so with helium.

adrianN an hour ago | parent [-]

You just have to put your hard drive in a pressure vessel filled with helium.

Forgeties79 an hour ago | parent [-]

It’s helium all the way down

Bender 22 minutes ago | parent | prev | next [-]

It is a good article but I am holding onto my hardware for other reasons. I predict it will not be long until all hardware has a set of Nanny chips that are named and marketed so that even people here on HN will argue on behalf of having them. It will be some "Secure enclave AI accelerated Super Mega Native Processing Underminer" and will start off securing and accelerating something or a set of somethings but will eventually tie into age verification, censorship and a Central Nanny Agency that all countries will obey.

- "Stare into this hole to verify your age."

- "Stick your finger in the box."

- "Sync ALL of your passwords into this secure enclave."

- "Ignore the pain to get your AI token bucks and unlock access to the shiny new attestation internet."

Every packet and data stream will be analyzed locally by the AI to determine the intentions and predict future behavior. The AI summarized behavior will be condensed into an optimized encoded table to be submitted hourly to the Central Nanny Overseer. I might be slightly exaggerating and a bit hyperbolic but it will be something in this spirit and people will sleep walk right into it.

My only question is which country will control the behavior of these chips.

the__alchemist 20 minutes ago | parent | prev | next [-]

It's wild to think how, a few years ago, I didn't buy a 4090 direct from Nvidia because "$1600 (USD) is too much to pay for a graphics card; if I need a better one, I'll upgrade in a few years." (Went with the 4080, which is substantially slower and was $1200.) Joke's on me!

It will be scarcity mindset from here on out; I will always buy the top-tier thing.

2716057 an hour ago | parent | prev | next [-]

As long as there are consumers paying for hardware ownership, there will be businesses willing to sell it to them. The worst scenario I could imagine is that one has to pay a premium for fully-owned hardware simply because consumers' desire for it becomes an oddity and it is thus sold in low quantities.

The current AI-induced shortages aside, the times have never been better in my opinion. There is overwhelming choice; ordinary consumers can access anything from Raspberry Pis all the way up to enterprise servers and AI accelerators. The situation was very different in the 1990s when I built my first PC.

CraigJPerry 3 hours ago | parent | prev | next [-]

The article's entire thesis looks like it can be completely derailed if one thing happens: AI infrastructure firms cease to be able to secure more capital.

Is that likely? History says it's inevitable, but the timeframe is an open question.

Shank 2 hours ago | parent | next [-]

> ai infrastructure firms cease to be able to secure more capital

If this does occur, unfortunately it isn’t like any of the production capacity is going to immediately shift or be repurposed. A lot of the hardware isn’t usable outside of datacenter deployments. I would guess a more realistic recalibration is 2-3 years of immense pain followed by gradual availability of components again.

lugu 2 hours ago | parent | next [-]

> If this does occur

The capital from the Gulf is already disrupted. It isn't a matter of if or when anymore.

CraigJPerry 2 hours ago | parent | prev | next [-]

Yeah, 3 years sounds reasonable to me, less than one asset depreciation cycle in business. Pain for you and me, but just a bump in the road for the accounting dept.

gzread 2 hours ago | parent | prev [-]

My computer, and I think all threadripper systems, has registered ECC DDR5 RAM which I think is the same type used in AI datacenters. Well one half of it, the other half being HBM memory used on video cards, which is soldered to them and non-upgradeable. But the main system memory from a used AI server can become your main system memory.

myself248 an hour ago | parent [-]

So that becomes the next question -- will we see an ecosystem of modifications and adapters, to desolder surplus and decommissioned datacenter HBM and put it on some sort of daughterboard with a translator so it can be used in a consumer machine?

Stuff like that already exists for flash memory; I can harvest eMMC chips from ewaste and solder them to cheaply-available boards to make USB flash drives. But there the protocols are the same, there's no firmware work needed...

duskdozer 20 minutes ago | parent [-]

Aren't some people already doing this with consumer GPUs?

sva_ 2 hours ago | parent | prev | next [-]

I think some players like xAI and Google can burn money for a long time. Google made $240B in profit last year.

ramon156 2 hours ago | parent | prev [-]

They would rent out the data centers, not sell them off

adamwong246 27 minutes ago | parent | prev | next [-]

I have often imagined writing a book, roughly "Fahrenheit 451 but with computers instead of books". Imagine a world where you do not buy an iPhone, one is assigned to you at birth; a world where "installing software" on "a computer you own" is not just antiquated or taboo, but unthinkable.

red-iron-pine 24 minutes ago | parent [-]

google and facebook are actively salivating at this possibility.

"don't create the torment nexus, etc."

anonzzzies 2 hours ago | parent | prev | next [-]

I do not see this from an infinite shortage point; I see this from a locked down hardware point. Old hardware is hackable, new hardware mostly not. That is for me where the real pain is and why I just buy old computers and phones that are rootable.

darkwater an hour ago | parent | prev | next [-]

> For the better part of two decades, consumers lived in a golden age of tech. Memory got cheaper, storage increased in capacity and hardware got faster and absurdly affordable.

I got my first PC circa 1992 (a 2nd hand IBM PS/2, 80286 processor with 2MB RAM and 30MB HDD) and the "golden age" was already there. We are well over 40 years of almost uninterrupted "pay less for more performances" in the home/personal computing space, and that's because that space started around 50 years ago. There was some fluctuation (remember the earthquake affecting HDD prices a few years ago?) but demand was there and manufacturing tech became more efficient.

The actual important change is that, for most consumer uses, the perf improvements stopped making sense what, over 10 years ago?

tmtvl 2 hours ago | parent | prev | next [-]

I grabbed an upgrade at the end of last year because my ~10 year old workhorse is starting to show signs of aging. Despite 16 gigs of RAM having lasted me thus far I decided to bite the bullet and get 32; so I expect this new machine to last me another 10 years (although I now have a full SSD, whereas my old workhorse had an SSD for the OS and a hybrid drive for /home, so we'll see whether or not it will actually last).

Forgeties79 an hour ago | parent [-]

Built my PC last April and also did 32 GB. Almost did 64 since RAM was so cheap at the time too, but hey, live and learn I suppose ha

vladde an hour ago | parent | prev | next [-]

when you click away to another tab, the title and favicon of the page changes to something weird, but really legit looking.

a couple of my favorites: "rust programming socks - Google", "Amazon.com: waifu pillow", "Rick Astley - Never Gonna Give You Up", "censorship on hacker news - Google"

foolserrandboy 32 minutes ago | parent [-]

It actually gives you a warning in an overlay first that the favicon will change if you open a new tab. I did, and I got "zuckerberg nudes"

mememememememo 3 hours ago | parent | prev | next [-]

In such a future, is the iPhone and Android ecosystem dead? Because a single $1k phone is a hell of a computer. So if you can still buy a phone, you can still get a computer. Local AI aside, these are very capable.

layer8 27 minutes ago | parent | next [-]

You don’t really own an iPhone in terms of being a computer. It’s different for certain Android phones where you can install a custom OS. Those are also less powerful, however.

pjc50 2 hours ago | parent | prev | next [-]

iOS is apparently going to have mandatory age gating, so likely that will come to Android as well.

mememememememo 2 hours ago | parent [-]

I was trying to avoid the software side of this argument as it is a worm canister. I was just musing from a hardware availability point of view.

That said.... hopefully at least on Android side you can get a free (as in unchastified) OS to run on it.

Until they come for the HW.

jagged-chisel 3 hours ago | parent | prev [-]

Maybe we’ll finally get some good tools to make real productive work possible on phones.

Angostura 3 hours ago | parent | next [-]

Like a large screen and a keyboard? Hello Mac Neo

kingleopold 2 hours ago | parent | next [-]

I could see them offering macOS on iPhone Pro and iPad Pro in the next few years, with a subscription or via a paid upgrade. I mean, it's more possible now than ever.

mememememememo 2 hours ago | parent | prev [-]

Hello HDMI adaptor and magic keyboard

9wzYQbTYsAIc 3 hours ago | parent | prev [-]

Virtual desktop casting, a killer HID product, and what else do you need?

abmmgb an hour ago | parent | prev | next [-]

I actually think the central thesis is thought-provoking: we have shifted far away from locally installed shit to remote data centre access, driven initially by cloud-based initiatives and now spiralling upwards with AI. For any researchers, hackers, and builders wanting to play with locally installed AI, hardware could become a bottleneck, especially as many machines, such as the beloved Macs, are not upgradable.

altcognito 18 minutes ago | parent | prev | next [-]

Part of this is that memory companies recognize that nobody is going to enforce antitrust law for the foreseeable future, so collusion to raise prices is the norm now.

pmdr an hour ago | parent | prev | next [-]

I've seen comments on here before that went somewhere along the lines of "adults don't care about RAM prices." HN is no stranger to siding with the oppressors.

xbmcuser an hour ago | parent | prev | next [-]

In the last month, 20-30% of oil supply, 30% of gas supply, and 30-40% of fertilizer production has been destroyed and could take anywhere from 8 months to 5 years to come back online. Governments are acting as if everything is okay so that there is no panic, but we have crossed the point of no return: even if the war ends today, food and energy shortages are over the horizon. If you can get an EV, solar, heat pumps, battery storage, etc., get it now, today, as fossil-fuel-based energy prices are going to go through the roof. I see similarities to when COVID hit: people kept looking at things happening in other countries and not preparing for the shit to hit their own cities and countries.

jleyank 2 hours ago | parent | prev | next [-]

Hold onto your hardware. Hold on to your existing software and the current version. Don’t upgrade without a specific need. None of the “progress” is actually helpful to hackers and I’m not sure it’s even helpful to typical users. There’s enough information being given to and slurped by others, don’t make it more effective.

archargelod 2 hours ago | parent | next [-]

My PC has an Intel Xeon from 2007, a GPU from 2010, and 4GB of RAM. It’s enough for web browsing and can handle 1080p/60fps video just fine.

For gaming, I have a dedicated device - a Nintendo Switch, but I also play indie PC games like Slay the Spire, Forge MTG, some puzzle games e.g. TIS-100.

Linux with i3 is fast and responsive. I write code in the terminal, no fancy debuggers, no million plugins, no Electron mess.

It’s enough for everything I need, and I don’t see a reason to ever upgrade. Unless my hardware starts failing, of course.

compounding_it 2 hours ago | parent | prev [-]

In order to go from 360p video 15 years ago to 4K HDR today, I have upgraded from 2 Mbps 802.11g WiFi on a 1366x768 display to a 200 Mbps connection on 802.11ax and a 55-inch 4K television.

The experience is quite immersive and well worth the upgrade that happened very progressively (WiFi 5 1080p then WiFi 6/7 4K).

sigio 2 hours ago | parent [-]

At the same time, we had cheap consumer gigabit Ethernet, and still have cheap consumer gigabit Ethernet. 2.5GbE is getting there price-wise, but switches are still somewhat rare/expensive.

duskdozer 3 hours ago | parent | prev | next [-]

    uBlock Origin has prevented the following page from loading:

    https://xn--gckvb8fzb.com/hold-on-to-your-hardware/

    This happened because of the following filter:

    ||xn--$document
    The filter has been found in:   IDN Homograph Attack Protection - Complete Blockage
sva_ 2 hours ago | parent | next [-]

What a silly filter, blocking all xn-- domains

zvqcMMV6Zcr 2 hours ago | parent | next [-]

That whole feature is kind of a catch-22. No legit/popular site uses it, so users don't expect national characters in domain names, so no one actually hosts sites using "xn--" domains.

shusaku 2 hours ago | parent | prev | next [-]

Kind of an interesting history to this kind of url: https://www.nic.ad.jp/ja/dom/idn.html

duskdozer 2 hours ago | parent | prev [-]

Shrug. First time I'd seen this. If it had displayed as the original text it would have been clearer.

numpad0 an hour ago | parent | next [-]

It would make it hard to spot impostor domains like "news.усомbiнаtor[.]сом" if it were. There's enough inertia for FQDNs to be strictly ASCII, and any UTF-8 (outside ASCII) in a domain name feels unnatural in a URL, so most systems default to the raw "Punycode" xn-- form for all IDNs.
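For anyone curious what the xn-- form actually encodes, Python's stdlib `idna` codec shows the round trip (a quick illustration with a benign umlaut label, not the impostor example above):

```python
# Punycode/IDNA round trip: Unicode labels <-> ASCII-compatible "xn--" form.
label = "bücher"                      # a domain label with a non-ASCII char
ascii_form = label.encode("idna")     # ASCII-compatible encoding
print(ascii_form)                     # prints b'xn--bcher-kva'
print(ascii_form.decode("idna"))      # round-trips back to 'bücher'
```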

httpsterio 2 hours ago | parent | prev [-]

In this case, yes, but it's meant as punycode scam prevention, where common Latin-alphabet letters are swapped for similar-looking alternatives.

lexlambda an hour ago | parent | prev [-]

This is a filter added by you (or by an overzealous list maintainer), it does not happen by default or even with the provided additional filterlists.

pcblues 31 minutes ago | parent | prev | next [-]

I think what many people don't realise is that there will be a glut of cheap computer parts including CPUs, GPU cards, and memory when the AI and AI-adjacent businesses go bust and a bunch of data centres get pulled down.

pjmlp 35 minutes ago | parent | prev | next [-]

I have been holding on to my hardware for decades; some of my private hardware traces back to 2009.

Phones and tablets only get replaced when they die.

Why should I throw away stuff that still works as intended?

rolandhvar an hour ago | parent | prev | next [-]

So what happens when the datacenters need to upgrade (new hardware, or stupid enterprisey reasons like "must be new when replacing broken stuff")? Surely there remains a secondary market for the enthusiasts?

shusaku 2 hours ago | parent | prev | next [-]

> These days, the biggest customers are not gamers, creators, PC builders or even crypto miners anymore. Today, it’s hyperscalers. …

> These buyers don’t care if RAM costs 20% more and neither do they wait for Black Friday deals. Instead, they sign contracts measured in exabytes and billions of dollars.

Does all this not apply to businesses buying computers for their employees?

grahammccain 35 minutes ago | parent | prev | next [-]

I feel like we will get out of the hardware constraints eventually.

vjerancrnjak 3 hours ago | parent | prev | next [-]

haha, all of a sudden I see a tab "waifu pillow" on Amazon, and think I have a split personality that runs searches in between consciousness shifts, and then I come back to a funny message.

mmackh 2 hours ago | parent | prev | next [-]

We are in a renaissance of computing right at this moment. If we expand our definition of computers beyond screens and traditional input devices, microcontrollers are capable of so much more, with so much less (energy consumption | RAM | storage).

The tipping point for MCUs was WiFi - which not only allows you to speak multiple protocols (UDP/Zigbee/HTTP/etc) and have audio IO, but also P2P communication and novel new form factors. There's been incredible progress with the miniaturisation of sensors and how we're able to understand and perceive our environment.

So yes, whilst traditional hardware is getting more expensive and locked down, there's a strong counter movement towards computing for everyone - and by that I also mean that there's going to be less abstraction in the entire stack. Good times ahead!

functional_dev 13 minutes ago | parent [-]

You are right! Power management improvements are what really enable these form factors... being able to run a WiFi sensor on a coin cell for a year makes applications possible that were unthinkable just a few years ago.

aurareturn 2 hours ago | parent | prev | next [-]

Capitalism at work. There is more value to be generated by moving resources to data centers for the moment. This isn't me being insensitive or anything. It's the same people who are buying iPhones and PCs who are demanding more compute for AI.

There could be a swing in the future where people will demand local AI instead and resources could shift back to affordable local AI devices.

Lastly, this thesis implies that we will be supply constrained forever such that prices for personal devices will always be elevated as a percentage of one's income. I don't believe that.

usrbinbash 2 hours ago | parent | prev | next [-]

As the old saying goes: "This too will pass."

Consumer hardware will always be a market worth serving for companies who don't see their stock price as their product.

If the existing companies are unwilling to make a sale, I am sure new players will arise picking up their slack.

https://www.youtube.com/watch?v=SrX0jPAdSxU

selectively 21 minutes ago | parent | prev | next [-]

This is just brainrot garbage. The idiotic stuff you see YouTubers saying. Why is this at the top of HN? Bots, I assume?

camgunz 3 hours ago | parent | prev | next [-]

I feel like this is just the bubble talking. I'm pretty naive here, but at some point suppliers will adjust so they can take money from data center builders and consumers, just like pre-bubble.

not_the_fda 3 hours ago | parent | next [-]

Chip manufacturers are used to boom bust cycles and are always hesitant to bring on more capacity, since it costs billions to do so.

They will let the hyperscalers buy their supply at a premium and wait for the bust. Then they will shift back to the consumer space.

Hardware is going to be expensive for a while, but it's not as dire as the article makes it out to be.

Tade0 an hour ago | parent | next [-]

I recall how in the 2010s RAM manufacturers were in crisis, as their margins were low and competition fierce - it got to a point where they started doing price fixing and got fined for it:

https://web.archive.org/web/20180513133803/https://www.techr...

Prices went down again after that.

To me this is just a temporary swing in the other direction - they're riding the gravy train while they can, because once it ends it's back to low prices.

9wzYQbTYsAIc 2 hours ago | parent | prev [-]

> Hardware is going to be expensive for awhile but its not as dire as the article makes it out to be.

At the same time, the article’s argument that the value of personal computer ownership is only going to rise, in terms of the value of speech, not strictly in terms of the value of lunch, is important to call out.

I’m glad I held on to my 2009 MacBook, for example, as it still functions today as an active part of my homelab, at an amortized cost of roughly one nice steak dinner per year.

XorNot 3 hours ago | parent | prev [-]

It's probably closer to: the suppliers don't think this will last and are ramping slowly, if at all, so they're not left holding the bag.

The US is headed for a cataclysmic crash at this point and it's not clear what will trigger it, but all those companies pushing underpriced tokens and Rust ports of existing tools by agents aren't going to survive it.

dade_ 44 minutes ago | parent | prev | next [-]

https://www.photonics.com/Articles/Nortel-Completes-Acquisit...

Oh bubbles... they're so bubbly. Remember when there was unlimited demand for fibre optics because - The Internet? So Nortel and other manufacturers lent money to the clients building the Internet, because the growth was unlimited forever? Except they didn't actually have any money, just stock valuations?

"This is a critical step in our effort to unleash the full potential of our high-performance optical component solutions business," said Clarence Chandran, COO of Nortel Networks. "This acquisition really strengthens Nortel Networks' leadership position in high-performance optical components and modules which are essential to delivering the all-optical Internet."

flyinglizard 2 hours ago | parent | prev | next [-]

It's a thought-provoking article, and I felt the pain when I recently shopped around for a new GPU to replace a 4090 I thought was faulty (eventually cleaning the PCIe connector fixed the crashes). I bought it at the end of 2022, and three and a half years later it seems like we've gone backwards, not forwards, on GPUs available to end users. They cost more and do less.

But also consider that PCs have been an anomaly for a very long time. I don't think there's an equivalent market where you, as a consumer, can buy off-the-shelf cutting-edge technical components in your local mall and piece them together into a working device. It's a fun model, for sure, but I'm not sure it's an efficient one. It was just profitable enough to keep the lights on, thanks primarily to a bunch of Taiwanese companies in that space, but it wasn't growing and the state of software is a mess.

Apple ate the PCs' collective lunch before DCs did. So did gaming consoles. I weep for consumer choice, but as things become more advanced, maybe PCs and their entire value chain don't make a lot of sense any more.

Obviously in the end there will still be consumer devices, because someone needs to consume all of this AI (at least until people are thrown entirely out of the loop, but then all those redundant meat sacks will need entertainment to keep them content). We have the consumer-device hyperscaler Apple doing rather OK even with these supply crunches, although I'm not sure for how long.

the__alchemist 15 minutes ago | parent [-]

Yea; I believe this is unprecedented. This is the first time I've observed this kind of regression in GPU price/performance. That 4090 is still top-tier, and now costs more than when it was new.

jagermo 2 hours ago | parent | prev | next [-]

I knew the time for my cable box would come!

lmz 2 hours ago | parent | prev | next [-]

Micron is killing its Crucial consumer brand, not its supplies to consumer brands that use its chips. I don't think Hynix ever had a consumer RAM brand?

xyst 12 minutes ago | parent | prev | next [-]

Not a single comment mentioning how programmers these days don’t give a shit about optimization.

defraudbah 2 hours ago | parent | prev | next [-]

I refuse. I'll buy when I need to, and I can hold off for a few months if prices become insane. This means I'll spend less on hardware than I otherwise would: if I wanted a maxed-out MacBook Pro or the latest Framework, I just won't buy it, because prices are too mad, and I'll go for a cheaper version instead.

Whatever happens, it's crazy, and I hope the AI madness is worth it.

sigio 2 hours ago | parent [-]

For laptops, I always spring for the lowest amount of RAM and HDD/SSD, then immediately upgrade from local after-market sources. This wouldn't work for Apple devices, however (hence I don't own any).

For example, my current ThinkPad T14 Gen 5 was bought with 8GB RAM and a 256GB NVMe drive, then upgraded to 64GB and 2TB for the same price the 16GB/512GB configuration would have cost at Lenovo. And I still have the 8GB/256GB parts to re-use or re-sell.

shevy-java 2 hours ago | parent | prev | next [-]

AI companies driving RAM prices up is, in my opinion, theft from the common man (and common woman). Sure, you can say that in capitalism, those who pay more benefit the most, but no system, not even the USA's, is purely capitalistic: you still have transfer payments, public infrastructure and what not. So private companies driving up prices, such as for RAM, is IMO also theft from common people, and it should not happen. It can only happen when you have lobbyists disguised as politicians who benefit personally from helping establish such a system. The same can be said about any other price-upwards scaling done via racketeering.

cynicalsecurity 2 hours ago | parent | prev | next [-]

Fear mongering hysteria.

pissinwind an hour ago | parent | prev | next [-]

Everything about tech and economy slowing is 1000% man made.

The Trump/anti-America phase has gone on way longer than I thought but it won’t last forever.

Even if we have to wait for this old world cabal to die and fade away, time is still on our side.

Boomers are stupid for using time as a weapon.

I’m chillin. Waiting for people to die while growing my businesses.

Travel to a functional place off the beaten path to see nobody can really stop forward progress. Even in these places where time has stopped.

dist-epoch 2 hours ago | parent | prev | next [-]

I'm not sure why people are upset. This is how Capitalism is supposed to work - resource allocation towards the most productive (in terms of Capital) usage.

Those who are best able to use a resource are willing to pay the most for it thus pricing out unproductive usages of it.

This is pure Capitalism.

If one is in general against Capitalism, yes, one can complain.

But saying "I want free markets" and "I want capitalism", but then complaining when the free markets increase the price of your RAM is utterly deranged.

Some will say "but Altman is hoarding the RAM, he's not using it productively". It's irrelevant, he is willing to pay more than you to hoard that RAM. In his view he's extracting more value from that than you do, so he's willing to pay more. The markets will work. If this is unproductive use of Capital, OpenAI will go bankrupt.

And the RAM sellers make more money, which is good in Capitalism. It would be irresponsible for them to sell to price sensitive customers (retail), when they have buyers (AI companies) willing to pay much more. And if this is a bad decision, because that AI market will vanish and they will have burned the retail market, Capitalism and Free Markets will work again and bankrupt them.

Survival of the fittest. That is Capitalism. And right now AI companies are the fittest by a large margin.

AI and Capitalism are the exact same thing, as famously put. We are in the first stages of turning Earth into Computronium, you either become Compute or you will fade away.

forinti 2 hours ago | parent [-]

The market can remain irrational longer than the capacitors on my motherboard can resist bloating.

keybored 3 hours ago | parent | prev | next [-]

Owning hardware is great. But I get the impression that some people view owning petty hardware as some liberty panacea.

You might have a DVD collection, ten external drives, three laptops, and a workstation. You may still, for all intents and purposes, be wholly dependent on cloud computing, say because it is the only practical way to run whatever AI-driven software exists three years from now.

Edit: That’s just an example; it goes beyond AI. And liberty goes beyond that.

falense 2 hours ago | parent [-]

I disagree. There is in fact a non-zero chance that we will get good-enough models, MoE-optimized for desktop-size hardware, that can do a lot of the same things as the SOTA models. I'm certainly crossing my fingers that the open-weights models continue improving. Engram from DeepSeek, for instance, seems very interesting from a compute-to-memory offloading perspective.

https://www.reddit.com/r/LocalLLaMA/comments/1s0czc4/round_2...

imtringued 2 hours ago | parent | prev [-]

I just realized that this blog site is pretending to be malware. I opened the tab and was constantly switching between the blog and writing this HN comment (I deleted the rest of the comment after realizing it). I was wondering where the tab went and kept reopening it, until I realized it had completely rewritten the tab title with NSFW content (one of the titles contained the word "nudes", with a faked Amazon favicon). When you return to the tab, it shows a black overlay with a message intended to induce shock, if you ever bother to read it (I didn't read past the first sentence, so I don't know what it was actually about).

Can dang/a moderator please ban the domain from HN? Even if it's not exactly malware, it's pretending to be malware to grab your attention, and it's obviously intended to fill your browser history with inappropriate content, which didn't work on my browser because I opened the blog in a private session. The operator clearly doesn't run his blog in good faith.

fishbacon 2 hours ago | parent | next [-]

The pop-up is about disabling javascript, to avoid this kind of website doing this kind of thing.

I thought it was clever. But it also seems ham-fisted and in poor taste.

highmango 2 hours ago | parent | prev | next [-]

The whole point is to grab your attention and bully you to turn off JavaScript. It links to another page: https://disable-javascript.org/

I opened the tab on my work laptop, and having an NSFW title and icon in the office is unacceptable. I understand the intent, but the implementation and this way of forcing people to do something is ridiculous. I do not own or control this machine; I trust the links on the front page of HN to be somewhat safe and not to put me in an uncomfortable position. Yes, the site is not necessarily malware, but it's a dark pattern, and that's not how you teach your average day-to-day user.

jaen 2 hours ago | parent | prev [-]

It doesn't write anything extra to the browser history. How about actually checking before exaggerating. If you are bothered by a single wrong title with the right URL, well... I think something else is wrong.

You are also completely speculating on the intent. Less drama please.