| ▲ | AI PCs make users less productive(theregister.com) |
| 71 points by rntn 2 days ago | 50 comments |
| |
|
| ▲ | galleywest200 2 days ago | parent | next [-] |
| > Lack of familiarity with AI PCs leads to what the study describes as "misconceptions," which include the following: 44 percent of respondents believe AI PCs are a gimmick or futuristic; 53 percent believe AI PCs are only for creative or technical professionals; 86 percent are concerned about the privacy and security of their data when using an AI PC; and 17 percent believe AI PCs are not secure or regulated. Is being concerned about your privacy really a misconception here? |
| |
| ▲ | add-sub-mul-div 2 days ago | parent | next [-] | | It's propaganda, as the article points out: "The chipmaker, which is quite keen to see people buy the AI PCs sold by its hardware partners," | |
| ▲ | zeta0134 2 days ago | parent | prev | next [-] | | What would be the point of an "AI PC" if not to run models locally? I'm very much uncomfortable with sending my keystrokes (or my codebase) off to some remote server. If I can run the model locally, the privacy problems in theory vanish and I'm much more likely to use the tech. If folks don't understand that, then yes I'd say it's a pretty big misconception and needs to be better marketed as a key feature. And if the AI PC is just a regular PC with a cloud bot integrated, then ... what even is the point? You can already do the remote chatbot thing with a regular PC, privacy nightmares included! | | |
| ▲ | defnotai 2 days ago | parent | next [-] | | Access to the latest foundation models, which can’t be run locally. AI feels like it’s in this really weird place where the latest Claude model sets expectations that can’t be matched by an on-device model. Even Apple Intelligence is getting a lot of negative feedback from the review crowd due to limitations like being unable to summarize a very large document (which is pretty much the point of such a feature). The problem is that AI has few well-defined use cases and a mountain of expectations, and this really shows in the execution by these companies. It’s hard to build good products when the requirements are “we don’t really know” | | |
| ▲ | Philadelphia 2 days ago | parent [-] | | Looking at the logs for Apple Intelligence on my iPhone 15 Pro Max, almost everything is actually run remotely | | |
| |
| ▲ | HWR_14 2 days ago | parent | prev | next [-] | | Tons of software runs locally but then still exfiltrates your data for a variety of reasons. Ads, product improvement, metrics, cross-device syncing, etc. | |
| ▲ | a2128 2 days ago | parent | prev | next [-] | | > if the AI PC is just a regular PC with a cloud bot integrated, then ... what even is the point? To trick people into buying new hardware, lest they get left behind in the AI race | |
| ▲ | throwaway290 2 days ago | parent | prev [-] | | > What would be the point of an "AI PC" if not to run models locally? Unrealistic for now because running ML locally is slow, but even if it weren't, even laypeople know by now that you can break an LLM into doing what it isn't supposed to. Since this LLM has unlimited access to your personal data in order to be useful, if I get to it I don't even need to bypass any secure enclaves or whatnot, because it will tell me the things I ask for in plain <insert your language>. All eggs in one basket
| |
| ▲ | kardos 2 days ago | parent | prev | next [-] | | Do the AI PCs do the work locally, or transmit everything remotely for processing like ChatGPT? | | |
| ▲ | Borborygymus a day ago | parent [-] | | There's not really a standard definition of an AI PC, but if there's anything beyond marketing, it would take the form of a neural processing unit, which supposedly makes some AI-related tasks fast by providing hardware acceleration. Cited examples include voice recognition and "training models", although I'm pretty sceptical about the latter happening at any significant scale. I suspect what it will actually boil down to in many cases is having Recall enabled by default and an always-listening voice assistant.
| |
| ▲ | Dalewyn 2 days ago | parent | prev [-] | | It's also not a misconception that "AI" PCs are for creative/technical professionals when the vast majority of the marketing is about shitting out creative(?) or technical(?) works(?) to reduce and ultimately democratize that workforce. | | |
| ▲ | dr_kiszonka 2 days ago | parent [-] | | "Democratize" used to have positive connotations, as in "democratize access to information". I didn't anticipate that it would have negative ones. | | |
| ▲ | boomlinde 21 hours ago | parent | next [-] | | I just think that "democratize" is the wrong word. AI tools that spit out prompt-based songs are enabling people not to create music, an option that was already available to exactly everyone. Instead, centralized AI tools will be creating more of our music for us, which is the exact opposite of democratization. | | |
| ▲ | Dalewyn 18 hours ago | parent [-] | | >AI tools that spit out prompt-based songs are enabling people not to create music, What level of mental gymnastics is this? This is like saying that cars enable people to not travel. Music is in the ears of the beholder, it doesn't matter how it was produced. If a tool lets more people compose music, which "AI" does, then it's democratizing composing music. Likewise: Crossbows and then guns democratized violence. Cars democratized personal transport. Currency democratized procurement. The printing press democratized writing. The internet democratized information. Democratizing is about making something more accessible to the commons. Whether that will be good or bad is not the concern of democracy. | | |
| ▲ | boomlinde 16 hours ago | parent [-] | | > What level of mental gymnastics is this? This is like saying that cars enable people to not travel. No, sitting in a car that's traveling, you are actually traveling with it regardless of whether you are driving it, so it's entirely unlike saying that cars enable people not to travel. It's more like saying that giving everyone access to a private robot driver doesn't somehow democratize driving and instead enables people not to drive. You can instruct the driver to drive where you want to go on your behalf, and you can travel along with them in the car, but it hasn't somehow enabled anyone but the robot to drive. It has however enabled you not to drive: even in situations where you would otherwise have driven you can now rely on your robot driver to do it for you. What the robot driver has democratized is access to something that can drive for you, not driving. Similarly, what AI/ML/LLM/whatever-based music generators and the like have democratized is access to something that can create music for you, not creating music. Not even the ATC can be considered to be pilots simply for giving instructions to others who fly planes, even though they give pilots much more detailed and involved instructions than anyone gives e.g. Suno. > Music is in the ears of the beholder, it doesn't matter how it was produced. Even if we assume that this is true, it has no bearing on my argument: whether or not it matters how music was produced is irrelevant to the question of who or what created the music. > If a tool lets more people compose music, which "AI" does, then it's democratizing composing music. But it doesn't let more people compose music. The "AI", for example Suno, creates the music for you, and your input is more akin to the instructions you give to the driver: extremely high level and entirely removed from the work involved in producing the result. 
Similarly, as a composer, writing a piece that performing artists then interpret and use as instructions for a performance doesn't somehow make me a performing artist, despite my instructions being much more detailed than any instruction that has ever been given to Suno, and therefore enabling me to be more involved in the outcome of the performance. > Likewise: Crossbows and then guns democratized violence. Will you agree that the degree of involvement has some bearing on the judgement of whether it is something I do or something others do for me? For example, that having a private robot driver drive me around at my behest doesn't make me a driver myself, but aiming and firing a gun at someone with the intent to cause violence to them makes me violent, despite the fact that I'm not the bullet that ultimately caused the harm? If so, the reductionist argument is misleading and we have to consider the degree to which you are involved in the process of creation when you tell Suno to produce a song. > Whether that will be good or bad is not the concern of democracy. Whether it's good or bad is also completely irrelevant to my argument, and seemingly also to your response to it, so why bring it up at all? | | |
| ▲ | Dalewyn 16 hours ago | parent [-] | | The flaw in your faulty logic is that you are attributing intent and purpose to a tool rather than the user of the tool. Additionally, you might also be attributing a persona to a static and mindless tool, which is also nonsense. An "AI" is a tool just like a car or a crossbow: each enables its user to do something more easily. Cars democratize travel, crossbows democratize violence, "AI"s democratize productivity. >Whether it's good or bad is also completely irrelevant to my argument, You literally replied to a comment concerning the possibility of the word "democratize" having innate positive and/or negative meanings. | | |
| ▲ | boomlinde 10 hours ago | parent | next [-] | | > The flaw in your faulty logic is that you are attributing intent and purpose to a tool rather than the user of the tool. I certainly don't attribute any intent to anything in my reasoning, but yes, most tools have a purpose. Suno's purpose, for example, is clearly to create music based on natural language prompts. Not that I have expressed that before, or that my argument relies on that fact, so whether or not you agree, it's irrelevant to what I'm saying. > Additionally, you might also be attributing a persona to a static and mindless tool which is also nonsense. I might? You should respond to my argument, not address things you baselessly think I believe, which is a waste of both our time. > You literally replied to a comment concerning the possibility of the word "democratize" having innate positive and/or negative meanings. I replied to say that "democratize" is the wrong word. My argument to that end doesn't hinge on whether democratization is a positive thing or a negative one. This discussion would be much easier if you responded to my points directly instead of trying to mischaracterize my argument as something it is not. Your response here is very light on directly addressing anything I said, and heavy on insisting that I believe things that I don't. | | |
| ▲ | Dalewyn 4 hours ago | parent [-] | | >This discussion would be much easier if you responded to my points directly I can't because you're talking nonsense. You are essentially saying that guns commit murder, not the user who shot the gun. You are saying that the tool carries intent and purpose, not the user. It's ridiculous, faulty logic and I cannot respond to that other than to point out it's quackery. |
| |
| ▲ | namaria 14 hours ago | parent | prev [-] | | If prompting some AI to create music is making music, those retired seniors watching construction work are building infrastructure. | | |
| ▲ | Dalewyn 4 hours ago | parent [-] | | Retired people by definition are not working, but that aside... Are you saying that programmers don't create software because they just "prompt a compiler"? Are you saying authors don't write books because they type into a word processor? Are you saying illustrators and artists don't draw because they are using a tablet? Are you saying construction workers aren't constructing because they're using excavators and cranes? I appreciate that many people suffer from AI Derangement Syndrome, but it's a ridiculous notion. "AI" is a tool like any other tool. |
|
|
|
|
| |
| ▲ | Spivak 2 days ago | parent | prev | next [-] | | It shouldn't, one of my friends is a professional copywriter and she's very pro-AI because it raises the writing bar for the devs she works with. "I can write better than AI, but AI can write better than you, use AI" is her go-to line with her coworkers. | | |
| ▲ | boomlinde 21 hours ago | parent [-] | | So does AI democratize copywriting? It seems it enables users not to write, which is already an option for everyone. It doesn't enable users to write in any way; instead it writes for you. Deferring to an oracle that writes for you is the exact opposite of democratization.
| |
| ▲ | Dalewyn 2 days ago | parent | prev [-] | | When everyone is special no one is, as the old saying goes. There's nothing inherently good or bad about everyone getting access to or becoming capable of something.
|
|
|
|
| ▲ | hunter2_ 2 days ago | parent | prev | next [-] |
| I was looking at laptops recently, and I noticed that the marketing blurbs on product pages are really getting extreme with the AI stuff in a weird SEO-like way. For example, this is on Costco's page for an Acer: > AI Ready for Tomorrow > Ready for the ever-evolving possibilities of AI? This Swift Go AI PC integrates Intel’s new dedicated AI engine—Intel® AI Boost—with Acer’s own AI solutions, for more intuitive and enjoyable AI experiences. It really seems like some kind of stupid joke. |
| |
|
| ▲ | A1kmm 2 days ago | parent | prev | next [-] |
I find the concept of "AI PC" to be somewhat nonsensical in the absence of a definition that is about the hardware. Just working out the age of my personal desktop computer has a Ship of Theseus problem - but safe to say 20+ years. However, it now has an RTX 3060 graphics card with 12 GB of VRAM, and NVMe SSDs, and can run inference on 7B-parameter 4-bit quantised Transformer LLMs, and generate images with large diffusion models. I've also used it for many applications that would count as AI before the latest generative AI hype cycle. So is it an AI PC? At what point did it become an AI PC? Or is a self-built machine in which you swap parts inherently never an AI PC? Given the fact that it is so amorphously defined, I would consider the term to be purely marketing fluff.
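The arithmetic behind that claim is easy to check. A minimal back-of-the-envelope sketch (illustrative numbers only; the ~20% overhead for KV cache and activations is an assumption, and real footprints vary by runtime and context length):

```python
def model_vram_gb(params_billion: float, bits_per_weight: float,
                  overhead: float = 1.2) -> float:
    """Rough VRAM estimate: weight storage plus ~20% for KV cache/activations."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 7B model at 4-bit quantisation fits easily in 12 GB...
print(round(model_vram_gb(7, 4), 1))   # ~4.2 GB
# ...while the same model at fp16 would not.
print(round(model_vram_gb(7, 16), 1))  # ~16.8 GB
```

Which is why quantisation, not any "AI PC" branding, is what made local inference practical on consumer GPUs.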
| |
| ▲ | palmfacehn 2 days ago | parent | next [-] | | In the 90's there were "Multimedia PCs" https://en.wikipedia.org/wiki/Multimedia_PC | |
| ▲ | safety1st 2 days ago | parent | prev | next [-] | | Assuming AI means LLM, at this stage I've come across two broad categories of implementation that are actually interesting and useful to me as a user. 1) A box on the screen where I can chat with one to do ideation or really anything I want. 2) A command-driven approach where I hit a hotkey, type a prompt and the response is dumped out in front of me, possibly I had some text selected which heavily influences the response. These are both pretty cool tbh and developers will have a field day for years finding sensible ways to incorporate them into programs. None of this has anything to do with driving the hardware upgrade cycle since most of the models are running in the cloud. But driving hardware upgrades is what these marketing people are really trying to do when they talk about AI PC. They are irrelevant people, but they need to convert everything they see into a reason to buy a new PC. That's what they get paid for. Monkey marketer see trend, monkey marketer do marketing. Monkey steal your attention. Maybe a LLM will replace THEM soon. After all it's basically a digital version of the million monkeys on typewriters... | |
| ▲ | hunter2_ 2 days ago | parent | prev [-] | | I think we just have a tendency to anthropomorphize things that are a bit too complicated to understand fully. Like a child calling the clutch mechanism in a yo-yo a "brain" for example. It's not that the yo-yo can really think, it's just that it has a behavior that seems that way. So indeed once you've upgraded your system to the point of not fully understanding how something that isn't a human could achieve whatever emergent behavior occurs, go ahead and anthropomorphize it by calling it AI. There's not a specific line in the sand, although tasking it with machine learning (in which outcomes improve based on collecting runtime inputs, rather than based only on its creator adding capabilities) would be a decent one. That's fairly human-like, while non-ML workloads are more plant-like. |
|
|
| ▲ | Nevermark 2 days ago | parent | prev | next [-] |
This is a natural phase of major tech breakthroughs. It is not a negative indicator. The opposite. It is normal for most tech to have an exploratory period, where its potential is clear, but its immediate economic impact is negative during iterations of product-market fit adaptation. Normally, the producer of new tech eats most or all of the risk and cost of the search for product-market fit. But some tech is so compelling that customers feel the strategic need to participate in the discovery loop too. Obviously, there are upfront costs and risks in deploying/trying tech that is still hit-and-miss. But during a sea change, there is also risk in not experimenting and adopting/adapting early.
|
| ▲ | jart 2 days ago | parent | prev | next [-] |
| > Our role as technology leaders is to support this transition to AI-assisted living Why is it that we all woke up one morning and every corporation is suddenly saying this same thing? |
| |
| ▲ | justinclift 2 days ago | parent | next [-] | | Their execs are probably all subscribed to the same corporate leadership magazines? | |
| ▲ | benfortuna 2 days ago | parent | prev [-] | | Just as with 3D televisions, be patient and this trend will pass. | | |
| ▲ | jpgvm 2 days ago | parent [-] | | Televisions have been fairly egregious in recent decades. Curved and 3D the primary suspects. Unfortunately "smart" turned out to be a way to make money despite the protests of the consumers so that isn't likely to be phased out/fixed unless regulation separates them from their ill-gotten partnership revenue. |
|
|
|
| ▲ | maeil 2 days ago | parent | prev | next [-] |
| > The chipmaker, which is quite keen to see people buy the AI PCs sold by its hardware partners Hah, no they very much aren't, as Intel are an insignificant player at the moment. As long as Intel is as far behind as they are, they'd rather overall investment in hardware goes down. The second Intel comes out with a leading chip, you'll suddenly see them come up with a study with the opposite result. |
|
| ▲ | rileymat2 2 days ago | parent | prev | next [-] |
| I am not an AI booster, but I would expect a learning curve slowing down current tasks for pretty much any technology that needs to be learned. |
| |
| ▲ | namaria 14 hours ago | parent [-] | | Fun fact: a "learning curve" was originally statistical evidence of a system getting better at something. It used to indicate that there can be improvement over initial performance when a process is repeated. I find it interesting that the meaning has shifted now to "this is hard at first" or "it takes a while to get results". |
|
|
| ▲ | hulitu 2 days ago | parent | prev | next [-] |
| > AI PCs make users less productive Idiot discovers that more "tools" make users less productive. |
|
| ▲ | floppiplopp 21 hours ago | parent | prev | next [-] |
| "3D-ready TV" |
|
| ▲ | kkfx 2 days ago | parent | prev | next [-] |
Commercial software and ignorance make users less productive... I do not spend much time finding my files, thanks to Emacs/org-mode/org-attach to master them; I do not waste time with an Office suite to prepare documents, I use LaTeX and org-mode, and that's far faster once learned. Essentially, FLOSS tools must be learned over time, typically at school, then you can profit for life; commercial software is an endless shallow-learning process, full of frustration to do anything. No LLM can solve that: it's a design chosen for business purposes.
| |
| ▲ | uludag 2 days ago | parent [-] | | > Commercial software and ignorance make users less productive... This is exactly what I have observed too. Notice how all commercial software comes very polished and modern with a very superficial UI. Despite most software products being hundreds of times more complex than simple electronic gadgets, such gadgets often come with more complex manuals. It's obvious that the true end-game of AI is not to make end users more productive, but rather de-skill end-users and make them entirely beholden to third-party software. Most companies would love the tradeoff of making their employees more like cogs at the cost of lower productivity. Alas, I too am held hostage by Emacs as most software doesn't even come close to the flexibility it provides. | | |
| ▲ | kkfx 2 days ago | parent [-] | | I suggest https://youtu.be/5yy6XvuO2aM summarized as "we want LLMs as life companions of our customers so we can drive their choices one LLM user interaction at a time". Of course such a model can't work well with skilled customers. In the end it's the same plot every time: in 1894 Guido Baccelli, seven-time Italian Education Minister, stated "we need to teach just to read and write, be careful teaching history, we MUST set aside anti-dogmatism and critical thinking, the people MUST NOT THINK or we will be in trouble". We see the same trend everywhere, in every profession as well, even in education https://www.theatlantic.com/ideas/archive/2020/08/i-was-usef... The issue is that such people are unable to innovate, and smart and educated people are always scarce, so if you do not cultivate knowledge you simply lose it; knowledge is the sole "natural resource" that grows with use instead of the contrary.
|
|
|
| ▲ | digitcatphd 2 days ago | parent | prev [-] |
| The reality is AI is largely in the uncanny valley right now. This is great because it weeds out the noise and gives the focused long term builders breathing room. |