A1kmm 2 days ago

I find the concept of "AI PC" to be somewhat nonsensical in the absence of a definition that is about the hardware.

Even working out the age of my personal desktop computer is a Ship of Theseus problem - but safe to say 20+ years. However, it now has an RTX 3060 graphics card with 12 GB of VRAM, and NVMe SSDs, and can run inference on 7B-parameter 4-bit-quantised Transformer LLMs and generate images with large diffusion models. I've also used it for many applications that would have counted as AI before the latest generative AI hype cycle.
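A quick sanity check on why a 12 GB card can hold a 4-bit-quantised 7B model: the weights alone take roughly params × bits ÷ 8 bytes. A sketch of that arithmetic (the overhead factor for KV cache and activations is my own rough assumption, not a figure from the comment):

```python
# Back-of-envelope VRAM estimate for running a quantised LLM locally.
# Assumption: weights dominate memory use; 'overhead' is a rough fudge
# factor for KV cache and activations.

def vram_estimate_gb(n_params: float, bits_per_weight: int,
                     overhead: float = 1.2) -> float:
    """Approximate GPU memory (GB) needed to run the model."""
    weight_bytes = n_params * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 7B-parameter model at 4-bit quantisation:
print(round(vram_estimate_gb(7e9, 4), 1))  # ~4.2 GB, comfortably under 12 GB
```

The same model at 16-bit would need ~16.8 GB, which is why quantisation is what makes consumer cards viable here.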

So is it an AI PC? At what point did it become an AI PC? Or is a self-built machine in which you swap parts inherently never an AI PC?

Given the fact that it is so amorphously defined, I would consider the term to be purely marketing fluff.

palmfacehn 2 days ago

In the '90s there were "Multimedia PCs":

https://en.wikipedia.org/wiki/Multimedia_PC

safety1st 2 days ago

Assuming AI means LLMs, at this stage I've come across two broad categories of implementation that are actually interesting and useful to me as a user.

1) A box on the screen where I can chat with one to do ideation or really anything I want.

2) A command-driven approach where I hit a hotkey, type a prompt, and the response is dumped out in front of me; if I had some text selected, it heavily influences the response.

These are both pretty cool tbh and developers will have a field day for years finding sensible ways to incorporate them into programs.

None of this has anything to do with driving the hardware upgrade cycle since most of the models are running in the cloud. But driving hardware upgrades is what these marketing people are really trying to do when they talk about AI PC. They are irrelevant people, but they need to convert everything they see into a reason to buy a new PC.

That's what they get paid for. Monkey marketer see trend, monkey marketer do marketing. Monkey steal your attention.

Maybe an LLM will replace THEM soon. After all, it's basically a digital version of the million monkeys on typewriters...

hunter2_ 2 days ago

I think we just have a tendency to anthropomorphize things that are a bit too complicated to understand fully - like a child calling the clutch mechanism in a yo-yo a "brain". It's not that the yo-yo can really think; it's just that it has a behavior that seems that way. So once you've upgraded your system to the point of not fully understanding how something that isn't a human could achieve whatever emergent behavior occurs, go ahead and anthropomorphize it by calling it AI.

There's not a specific line in the sand, although tasking it with machine learning (in which outcomes improve based on collecting runtime inputs, rather than based only on its creator adding capabilities) would be a decent one. That's fairly human-like, while non-ML workloads are more plant-like.