jbverschoor 4 days ago

Local LLMs: everybody is worried about privacy, and many people (still) don't want to buy subscriptions.

Just sell a proper HomePod with 64GB–128GB of RAM that handles everything locally: your personal LLM, Time Machine if needed, and remote access back to your Mac (Tailscale/ZeroTier).

Plus, they could compete efficiently with the other cloud providers.
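
A minimal sketch of the "personal LLM over Tailscale" piece, assuming an Ollama server running on the home device; the hostname "homepod" (a Tailscale MagicDNS name) and the model tag are placeholders, not real products:

    # Sketch: query a local LLM running on a home device over Tailscale.
    # Assumes an Ollama server on its default port 11434; "homepod" is a
    # hypothetical MagicDNS hostname and the model tag is whatever has
    # been pulled onto the device.
    import requests

    HOME_DEVICE = "http://homepod:11434"

    def ask(prompt: str, model: str = "llama3.1:8b") -> str:
        resp = requests.post(
            f"{HOME_DEVICE}/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["response"]

    if __name__ == "__main__":
        print(ask("Summarize today's calendar in two sentences."))

With MagicDNS the same endpoint is reachable from any machine in the tailnet, so a phone or laptop outside the house talks to the box at home instead of a cloud API.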

brookst 4 days ago | parent | next [-]

It’s a mistake to generalize from the HN population.

Most people don’t care about privacy (see: success of Facebook and TikTok). Most people don’t care about subscriptions (see: cable TV, Netflix).

There may be a niche market for a local inference device that costs $1000 and has to be replaced every year or two during the early days of AI, but it’s not a market with decent ROI for Apple.

j45 3 days ago | parent [-]

An iPhone, MacBook, etc. all cost in the $1,000 range.

There was a post about the new iPhone using the A19, which includes a feature that makes local inference much easier.

If that makes it to the M5, I think the local inference case continues to grow with each M-series generation.
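
As a rough sketch of how low the barrier already is on current Apple silicon, here is local inference via Apple's MLX framework (the mlx-lm package); the model repo is one example from the mlx-community collection, not a specific recommendation:

    # Sketch: run a small quantized model locally on an M-series Mac with mlx-lm.
    # Assumes `pip install mlx-lm`; the model identifier below is an example
    # from the mlx-community Hugging Face organization.
    from mlx_lm import load, generate

    model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")
    reply = generate(
        model,
        tokenizer,
        prompt="Explain Time Machine backups in one paragraph.",
        max_tokens=200,
    )
    print(reply)
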

bigyabai 4 days ago | parent | prev | next [-]

> Just sell a proper HomePod with 64GB-128GB ram

The same HomePod that sold almost as poorly as the Vision Pro despite a $349.99 MSRP? Apple charges $400 to upgrade an M4 to 64GB and a whopping $1,200 for the 128GB upgrade.

Consumer demand for an $800+ device like this is probably zilch; I can't imagine it's worth Apple's time to gussy up a nice UX or support it long-term. What you are describing is a Mac with extra steps. You could probably hack together a similar experience with Shortcuts if you had enough money and a use case. An AI HomePod server would only be efficient at wasting money.

redundantly 4 days ago | parent [-]

> The same HomePod that sold almost as poorly as the Vision Pro despite a $349.99 MSRP?

The HomePod did poorly because competitor offerings with similar and better performing features were priced under $100. The difference in sound quality was not worth the >3x markup.

VagabundoP 4 days ago | parent | prev [-]

Have a team pushing out optimised open-source models. Over time this thing could become the house AI: basically, Star Trek's computer.