psyclobe 5 hours ago

I have always envisioned an AI server being part of a family's major purchases, e.g. when they buy a house, appliances, etc., they also buy an 'AI system'.

Machine hardware evolution is slowing down; pretty soon you'll be able to buy one big-ass server that lasts potentially decades, as it would be purpose-built for AI.

Things like 'context-based home security'? Yeah, that's just automatic, free, part of the AI system.

Everyone will talk to the AI through their phones, and it'll be connected to the house. It'll hold the family's lineage info, may be passed down through generations, etc., and it'll all be 100% owned and offline, for the family: a forever assistant, just there.

nateb2022 5 hours ago | parent | next [-]

I disagree. Let's take the M1 vs the M5 (https://www.macrumors.com/2025/11/10/apple-silicon-m1-to-m5-...):

  - 6× faster CPU/GPU performance
  - 6× faster AI performance
  - 7.7× faster AI video processing
  - 6.8× faster 3D rendering
  - 2.6× faster gaming performance
  - 2.1× faster code compiling
Over the span of 5 years.
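Those multipliers compound over five years; a quick back-of-envelope sketch (my own illustration, not from the article) of the annual improvement rates they imply:

```python
# Convert a 5-year speedup multiple into the implied compound
# annual improvement factor (a CAGR-style calculation).
def annual_rate(multiple: float, years: int = 5) -> float:
    """Per-year improvement factor implied by `multiple` over `years`."""
    return multiple ** (1 / years)

# Speedup figures quoted above (M1 -> M5).
speedups = {
    "CPU/GPU": 6.0,
    "AI": 6.0,
    "AI video": 7.7,
    "3D rendering": 6.8,
    "gaming": 2.6,
    "code compiling": 2.1,
}

for name, m in speedups.items():
    r = annual_rate(m)
    print(f"{name}: {m}x over 5 years ~ {(r - 1) * 100:.0f}%/year")
```

Even the smallest figure, 2.1x for compiling, works out to roughly 16% per year compounded, which is slower than the old clock-speed era but hardly stagnation.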

Plus, realistically what makes an "ai" server different from a computer? This "lineage info of the family may be passed down through generations" sounds nice but do you know anyone passing down a Commodore 64 or Apple II that remains in daily use? I fail to see how "ai" would protect something from obsolescence.

Melatonic 3 hours ago | parent | next [-]

It doesn't matter if computers keep getting faster - it just matters whether they eventually get to the point where everything is good enough for good AI.

That being said, I feel like we're gonna get to that point for most other stuff way sooner than AI (and already have for many pieces of software).

vercaemert 2 hours ago | parent [-]

I love this conundrum.

I have a good analogy. 10 years ago, I was convinced that a 24-inch 1080p monitor at arm's length was perfection. There could never be any reason to improve over it. I could do everything I ever wanted to, to a standard I would never need to improve upon.

Yet here we are. The simplest and most obvious improvement is a 24" 4k monitor at 200% scaling. Basically, better in every way.

There's a discussion to be had about whether you need the better setup, which I think is your point, but there's no denying you'd want it (all other variables the same).

spiderfarmer 43 minutes ago | parent [-]

At some point specs don’t matter. I don’t wonder about the processor in my thermostat either. I don’t know how many horsepower my XC90 has. I don’t know the rated power of my chainsaw.

All I care about is: do they work, are they ‘safe’, are they comfortable, etc.

BearOso 5 hours ago | parent | prev | next [-]

That first bullet is a bit sketchy. Benchmarks, particularly Geekbench, may have increased 6x, but that figure is being manipulated.

The GPUs have become much larger, so 6.8x is believable there, as is the inclusion of a matmul unit boosting AI.

The 2.x numbers are the most realistic, especially because they represent actual workloads.

majormajor 5 hours ago | parent [-]

Even the geekbench numbers from the link only ~doubled. For both single- and multi-core CPU and Metal GPU.

omgwtfbyobbq 2 hours ago | parent | prev | next [-]

It depends on what/how you're comparing. Core to core, according to CPU benchmark, the M5 is around 5800 vs the M1 at 3600, so we're still not quite to 2x.

Overall system performance is better, at about a 2x improvement, thanks to extra cores and other changes. I could see more specialized benchmarks improving by more, thanks to power, size, and architectural improvements in other components (GPU/NPU/etc.).

psyclobe 4 hours ago | parent | prev [-]

Today, not much differentiates them. But as time passes, our only option for realistic gains will be to further specialize the hardware; at some point, perhaps a 'purpose-built analog computer' kind of thing will become so useful that it would be like the 'Standard Template Constructs' concept in Warhammer 30k. So what if you can make a faster AI? The current one can already 'teach everyone, basically anything'.

zamadatix 5 hours ago | parent | prev | next [-]

If you bought a big-ass server for your home 10 years ago, it probably wouldn't even have had a GPU/AI accelerator at all. If it did, it would have been something with wimpy compute and VRAM, because you needed the video encoder/decoder for security cameras or the like.

I'm not sure that really gives confidence hardware has slowed down enough to invest in it for decades. Single-core CPU performance has, but that's not really what new things are using.

camdenreslink 5 hours ago | parent | next [-]

It really just depends on if the hardware is "good enough" for whatever its purpose is. If the hardware today can locally run whatever models for your security cameras, it's likely they will still be "good enough" in 10 years.

Of course, similar to a 10 year old car or appliance, you will be missing any new features or bells and whistles that have become available in the meantime.

wtallis 5 hours ago | parent [-]

I agree; it's important to recognize that there are lots of use cases where computers have long since reached "good enough" and aren't really going obsolete anymore for those use cases.

My NAS is about 13 years old, the network switches it connects through are even older, and while 2.5GbE now exists, I have no need to throw out my "good enough" equipment to replace it with something marginally faster or more power efficient. I don't even really need to expand the storage of that NAS anytime soon, because my music collection could never come close to filling it, my movie/TV collection isn't growing much anymore due to the shift to streaming, and the volume of other stuff that I need to back up from my other computers just isn't growing much over the years.

kennywinker 4 hours ago | parent | prev | next [-]

You're kind of undermining your own point. Ten years later, the only thing you'd need to upgrade for your home server might be the GPU, because a new use-case emerged. Okay? Spend $500-$1000 on an eGPU. Problem solved. Will that eGPU setup last another ten years? If all it's doing is processing security video and routing claw-like tasks, then yes.

zamadatix 3 hours ago | parent [-]

Not sure I follow why - that the server from 10 years ago would be completely unfit for purpose now should not imply the one you buy today would be the right hardware 10 years from now. Unless you can somehow guarantee we've reached, in just these last few years, the final set of new requirements we will ever have, the GPUs you buy today will probably be just as irrelevant to the new requirements a decade from now.

Of course one can always upgrade components piecewise as requirements change, but I don't see why you need to invest in a big-ass server to do that. It'd be cheaper to go the route everyone has for decades at this point: upgrade with normal-sized stuff as needed, rather than trying to make an up-front multi-decade home investment out of it.

On the flip side, if you intentionally plan to lock in the capabilities to the kinds of things one can run today, and therefore know you'll never need to upgrade, then you can get whatever sized system makes sense for today's needs. You just need to be really sure you won't be interested in "the next big thing" when it comes, too.

majormajor 5 hours ago | parent | prev | next [-]

Decades is a long time for hardware, but "years" seems reasonably soon. The commercial models are "good enough" for a lot of things now, so if that performance makes its way into the on-device space at "home appliance"-level cost (<$5k at the start, basically), I'd expect a lot of stuff to start popping up there. In offices too.

Like the PC in the 80s starting to eat up "get a mainframe" or "rent time on a mainframe" uses.

psyclobe 4 hours ago | parent | prev [-]

Yeah, but how long do mainframes last? Think of the COBOL systems used in government. No reason to update them; they worked forever. Their job is discrete, and they performed it well enough that intense updating wasn't a requirement.

icedchai 4 hours ago | parent [-]

You also need to ask: How much do mainframes cost? They were engineered for backwards compatibility and reliability, with built in redundancy you don't find in consumer hardware.

AI models are changing every other day. I have to rebuild llama.cpp from source regularly. We are nowhere close to a personal "AI mainframe."

beoberha 5 hours ago | parent | prev | next [-]

I don’t think there’s anything different between what you’re suggesting and a homelab. Most people do not have a homelab and are happy to offload services like photo storage or security to remote providers.

sbarre 5 hours ago | parent | next [-]

I think that attitude is (very) slowly changing though and might not be the default forever.

My elderly parents have asked me about "local backups" of their cloud stuff, their Facebook history etc..

If they're thinking about the risks/tradeoffs of being in the cloud..

I think people use the cloud because there's no better/easier option today.

But at some point there might be. A home appliance (which may be similar to a homelab under the hood but the user experience is where things change) that provides a bunch of automation and home services could be quite attractive if it got to a point of being very turnkey for the average family.

Just like a TV or a gaming console is today.

beoberha 5 hours ago | parent [-]

There’s no better option today because it’s impossible to make it a better experience. That machine at home will need upgrades, it could fail, it costs thousands, it sucks lots of power. There is no mass market appeal.

psyclobe 4 hours ago | parent | prev | next [-]

I'm thinking an 'everyone needs an air conditioner' kind of need, instead of 'some nerds run servers'. And this 'AC' is your 'AI'.

Maybe even subsidized by the government. This will be a fundamental need.

dminik 2 hours ago | parent [-]

Hard to make an AC 500km away cool down my home. An AI doesn't really need to be in my flat. Not that I have the space for a server rack anyways.

nateb2022 5 hours ago | parent | prev | next [-]

Strongly agree. Plus, for all but very specific use cases, most people will spend less money by paying for cloud services, with "most" here referring to the general population.

j45 5 hours ago | parent | prev [-]

Home labs feel wholly different and require custom setup and maintenance.

In the case of an AI server, a home appliance would be like a toaster: a ready-to-go appliance that's preloaded and self-contained, connects to everything in your home, and helps you manage it, likely by just voice chat or some minimal interface.

beoberha 5 hours ago | parent [-]

What you’re describing is more likely to manifest as a proprietary product from someone like Samsung or Ring (likely both!) than an open standard AI server that integrates with everything in your home automatically. This is exactly like what we have today with security systems and smart appliances. You have managed services and you have Home Assistant in your homelab.

jjcm 4 hours ago | parent | prev | next [-]

I think this is likely, but in a slightly different way - I think we're going to start seeing more LLMs baked into silicon a la Taalas' ASIC.

ie, something like this fake future apple device page: https://speculate-mai.pages.dev/

Octoth0rpe 5 hours ago | parent | prev | next [-]

> pretty soon you can buy one big ass server that will last potentially decades as it would be purpose built for ai.

This feels like a very, very weak prediction (though certainly possible).

jmalicki 5 hours ago | parent [-]

Perhaps if we truly run out of steam on the process node front?

Octoth0rpe 5 hours ago | parent [-]

Even if that happened tomorrow, I suspect we'd have _at least_ a decade of people tweaking/optimizing designs on the same node to squeeze meaningful performance upgrades out. E.g., coming up with hardware support for new int/float formats that make more sense for the models of 2029, running matrix operations on RAM chips directly, etc.

runako 4 hours ago | parent [-]

I remember back in the early 2000s when people thought we were running out of steam on the advancements front. This was roughly around the time when CPU clocks stopped getting faster. The Pentium 4 hit 3GHz back in 2002; Intel Core Ultra 5 performance cores are generally around this exact speed two decades later.

Since at least the 640kb quip, betting against progress or the appetite for progress has been a losing bet.

jmalicki 2 hours ago | parent [-]

Honestly, post-2005, things did slow down dramatically for typical single-core workloads.

In the late 90s and early 2000s the mantra was "why waste time optimizing your software? By the time you're done the next gen of CPUs will have made up the difference."

Now the increase is more about moving to GPUs and power efficiency etc. We still have increases, but the rate of speedup has slowed down a lot.

jagged-chisel 5 hours ago | parent | prev | next [-]

And it's not going to happen any time soon because there's no recurring revenue to be gained from users/homeowners for such a thing.

trout_scout 5 hours ago | parent | next [-]

There's a potential case for a subscription model to keep security updated for the connection to users' phones, as well as ongoing support for less tech-savvy users (e.g. "I told my assistant to turn on my smart dishwasher and it turned on my smart washing machine instead"). I'd imagine the HN crowd would lean toward an open-source version, though.

anoopengineer 5 hours ago | parent | prev | next [-]

With that logic, there wouldn't be anyone selling refrigerators or dishwashers.

idle_zealot 5 hours ago | parent | next [-]

If dishwashers were invented today they would be rented out to homes and businesses with DRM to lock you into buying approved detergent and tableware. Times change, and more exploitative arrangements are normalized. This ratchet is primed to go in one direction, and only moves the other way in fits and starts borne of great effort.

ar_lan 3 hours ago | parent | prev | next [-]

I wouldn't be surprised if there was some plan to generate a subscription model for appliances.

qsera 5 hours ago | parent | prev | next [-]

I take it that you have never come across the idea of "planned obsolescence"..

re-thc 5 hours ago | parent | prev | next [-]

A lot of the leaders of that era have been going downhill ever since, e.g. the top Japanese manufacturers.

aegis_camera 5 hours ago | parent | prev [-]

:)

psyclobe 4 hours ago | parent | prev [-]

Well, custom/bespoke training for your family's particular needs, perhaps, performed once every 5 years.

I envision analog/custom/bespoke AI hardware that is fundamentally 'good enough'. As the market's need for these systems grows and time progresses, at some point it'll be like Warhammer 30k, where these 'Standard Template Constructs' are smart enough to basically teach you anything.

anoncow 3 hours ago | parent | prev | next [-]

Reminds me of how 12, Grimmauld Place works in the Harry Potter books. With an AI server the enchantments could be so much better.

icedchai 4 hours ago | parent | prev | next [-]

Based on our current trajectory, it seems more likely everyone will upload everything to the cloud and pay perpetual royalties to access their own data.

psyclobe 4 hours ago | parent [-]

I really think this is a temporary scenario. There will be advancements in AIs building the next generation of AIs, where the scale of the model continually shrinks, and maybe there will be some breakthrough that allows us to double the use of existing hardware/memory, etc.

10 years ago I couldn't do Alexa at my house; now I'm pretty close with a Qwen3:8b / Ollama LLM. (I mean, I never really wanted Alexa to do anything other than play music, automate stuff, etc.; zero interest in it teaching me how to code.)
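Part of why an 8B-parameter model fits on home hardware at all is quantization. A rough sketch of the memory arithmetic (my own illustration; numbers are approximate and ignore the KV cache and runtime overhead):

```python
# Approximate weight-memory footprint of an LLM at different precisions.
def weights_gb(params_billion: float, bits_per_weight: int) -> float:
    """GB needed just for the weights (using 1 GB = 1e9 bytes)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# An 8B model shrinks from ~16 GB at 16-bit down to ~4 GB at 4-bit,
# which is what makes it plausible on consumer RAM/VRAM.
for bits in (16, 8, 4):
    print(f"8B model at {bits}-bit: ~{weights_gb(8, bits):.0f} GB")
```

This is the "double the use of existing hardware/memory" effect in practice: each halving of bits per weight halves the footprint, at some cost in quality.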

I'm even thinking at some point we'll consider access to AI a fundamental human right, as otherwise you are inherently at a disadvantage in terms of wealth prospects compared to those who do have access.

HanClinto 5 hours ago | parent | prev | next [-]

Reminds me of the mainframe in The Moon is a Harsh Mistress.

aegis_camera 5 hours ago | parent | prev | next [-]

Thanks for your insight. AI hardware will get cheaper, and footage would always be saved locally.

lm28469 4 hours ago | parent | prev | next [-]

This is your reminder we're in a bubble inside of a bubble...

Most people don't even think about running network cables or mesh WiFi when building a house; no one will buy a server to run AI in their physical home.

jiveturkey 5 hours ago | parent | prev [-]

> I have always envisioned a ai server being part of a family's major purchases

and an Oxide rack