voidfunc 2 days ago

Also, what kind of AI tasks is the average person doing? The people thinking about this stuff are detached from reality. For most people a computer is a gateway to talking to friends and family, sharing pictures, browsing social media, and looking up recipes and how-to guides. Maybe they do some tracking of things as well in something like Excel or Google Sheets.

Consumer AI has never really made any sense. It's going to end up in the same category of things as 3D TVs, smart appliances, etc.

ryandrake 2 days ago | parent | next [-]

I don't remember any other time in the tech industry's history when "what companies and CEOs want to push" was less connected to "what customers want." Nobody transformed their business around 3D TVs like current companies are transforming themselves to deliver "AI-everything".

walterbell 2 days ago | parent | next [-]

If memory shortages make existing products non-viable (e.g. 50% price increases on mini PCs, https://news.ycombinator.com/item?id=46514794), will consumers flock to new/AI products like OpenAI "pen" or reject those outright?

Tanoc 2 days ago | parent | prev | next [-]

I think it does make sense if you're at a certain level of user hardware. If you make local computing infeasible because of the computational or hardware cost, it becomes much easier to sell compute as a service. Since about 2014 almost every change to paid software has been toward recurring fees rather than single payments, and now they can do the same with hardware. To the financially illiterate, paying $15 a month for two LLM subscriptions on a phone they have a $40 monthly payment on for two years seems like a better deal than paying $1,200 for a desktop computer with free software that they'll use a tenth as much as the phone.

This is why Nvidia is offering GeForce Now the same way, in one-hundred-hour increments: they can get $20 a month that goes directly to them, with the chance of up to an additional $42 maximum if the person buys extra extensions of equal size (another one hundred hours). That works out to $744 a year directly to Nvidia, without any board partners getting a cut, while a mid-grade GPU with better performance and no network latency would cost about that much and last the user five entire years. Most people won't realize that long before they reach the end of the service's useful lifetime, they'll have paid three to four times as much as if they had just bought the hardware outright.
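As a quick sketch of the arithmetic in that comparison (the dollar figures are the comment's own, not official Nvidia pricing):

```python
# Subscription-vs-hardware cost comparison, using the comment's numbers.
MONTHLY_BASE = 20        # base streaming fee, $/month, goes directly to Nvidia
MONTHLY_EXTRA_MAX = 42   # maximum extra spend on hour-pack extensions, $/month
GPU_PRICE = 744          # comparable mid-grade GPU, one-time purchase
GPU_LIFETIME_YEARS = 5   # the comment's estimated useful life of that GPU

yearly_subscription = 12 * (MONTHLY_BASE + MONTHLY_EXTRA_MAX)
total_subscription = yearly_subscription * GPU_LIFETIME_YEARS

print(yearly_subscription)                  # 744 -- one year of max spend = one GPU
print(total_subscription)                   # 3720 over the GPU's lifetime
print(total_subscription / GPU_PRICE)       # 5.0x the hardware price at max spend
```

At the base fee alone ($20 x 12 x 5 = $1,200 vs. $744) the ratio is closer to 1.6x, so the "three to four times" figure assumes a user who buys extensions some but not all of the time.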

With more of the compute pushed off of local hardware, they can cheap out on that hardware with smaller batteries, fewer ports and features, and weaker CPUs. This lessens the pressure they feel from consumers, who were taught by corporations in the 20th century that improvements will always come year over year. They can sell less capable hardware and make up for it with software.

For the hardware companies it's all rent seeking from the top down. And the push to put "AI" into everything is a blitz offensive to make this impossible to escape. They just need to normalize non-local computing and have it succeed this time, unlike when they tried it with the "cloud" craze a few years ago. But the companies didn't learn the intended lesson last time when users straight up said that they don't like others gatekeeping the devices they're holding right in their hands. Instead the companies learned they have to deny all other options so users are forced to acquiesce to the gatekeeping.

jimbokun a day ago | parent | prev [-]

The customers are CEOs dreaming of a human-free work force.

shermantanktop a day ago | parent [-]

Suggested amendment: the customers are CEOs dreaming of Wall Street seeing them as a CEO who will deliver a human-free work force. The press release is the product. The reality of payroll is incidental to what they really want: stock price go up.

It's all optics, it's all grift, it's all gambling.

tjr 2 days ago | parent | prev | next [-]

Just off the top of my head of some "consumer" areas that I personally encounter...

I don't want AI involved in my laundry machines. The only possible exception I could see would be some sort of emergency-off system, but I don't think that even needs to be "AI". But I don't want AI determining when my laundry is adequately washed or dried; I know what I'm doing, and I neither need nor want help from AI.

I don't want AI involved in my cooking. Admittedly, I have asked ChatGPT for some cooking information (sometimes easier than finding it on slop-and-ad-ridden Google), but I don't want AI in the oven or in the refrigerator or in the stove.

I don't want AI controlling my thermostat. I don't want AI controlling my water heater. I don't want AI controlling my garage door. I don't want AI balancing my checkbook.

I am totally fine with involving computers and technology in these things, but I don't want it to be "AI". I have way less trust in nondeterministic neural network systems than I do in basic well-tested sensors, microcontrollers, and tiny low-level C programs.

the_snooze 2 days ago | parent [-]

A lot of consumer tech needs have been met for decades. The problem is that companies aren't able to extract rent from all that value.

PunchyHamster 2 days ago | parent | prev | next [-]

I do think it makes some sense in limited capacity.

Have some half-decent model integrated with the OS's built-in image editing app so the average user can do basic fixing of their vacation photos with a few prompts.

Have some local model with access to files automatically tag your photos, maybe even ask it some questions and add tags based on that, and then use those tags for search ("give me a photo of that person from last year's vacation").

Similarly with chat records.

But once you start throwing it into the cloud, people get anxious about their data getting lost, or might not see the value in a subscription.
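The tag-then-search idea above can be sketched in a few lines. The `tag_image` function here is a stub standing in for whatever on-device vision model would actually produce the tags; the filenames and tags are made up for illustration:

```python
# Toy sketch of local photo tagging + tag-based search.
# tag_image is a placeholder: a real version would run a local model.

def tag_image(path: str) -> set[str]:
    fake_tags = {
        "beach_2024.jpg": {"beach", "vacation", "2024", "alice"},
        "cabin_2023.jpg": {"cabin", "vacation", "2023", "bob"},
    }
    return fake_tags.get(path, set())

def build_index(paths: list[str]) -> dict[str, set[str]]:
    # Tag every photo once, up front, entirely on-device.
    return {p: tag_image(p) for p in paths}

def search(index: dict[str, set[str]], wanted: set[str]) -> list[str]:
    # Return photos whose tags include every wanted tag.
    return [p for p, tags in index.items() if wanted <= tags]

index = build_index(["beach_2024.jpg", "cabin_2023.jpg"])
print(search(index, {"vacation", "2024"}))  # ['beach_2024.jpg']
```

A natural-language query like "that person from last year's vacation" would first be turned into tags ({"alice", "vacation", "2024"}) by the same local model, then fed to `search`; nothing here needs to leave the device.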

fragmede 2 days ago | parent | prev | next [-]

You and I live in different bubbles. ChatGPT is the go-to for my non-techie friends for advice on basically everything: women asking it for relationship advice and medical questions, guys using it for business ideas and lawsuit stuff.

chpatrick 2 days ago | parent | prev | next [-]

Consumer local AI? Maybe.

On the other hand everyone non-technical I know under 40 uses LLMs and my 74 year old dad just started using ChatGPT.

You could use a search engine and hope someone answered a close enough question (and wade through the SEO slop), or just get an AI to actually help you.

jimbokun a day ago | parent | prev [-]

“Do my homework assignment for me.”