▲ | zeta0134 2 days ago | ||||||||||||||||||||||
What would be the point of an "AI PC" if not to run models locally? I'm very much uncomfortable with sending my keystrokes (or my codebase) off to some remote server. If I can run the model locally, the privacy problems in theory vanish and I'm much more likely to use the tech. If folks don't understand that, then yes I'd say it's a pretty big misconception and needs to be better marketed as a key feature. And if the AI PC is just a regular PC with a cloud bot integrated, then ... what even is the point? You can already do the remote chatbot thing with a regular PC, privacy nightmares included! | |||||||||||||||||||||||
▲ | defnotai 2 days ago | parent | next [-] | ||||||||||||||||||||||
Access to the latest foundation models, which can’t be run locally. AI feels like it’s in this really weird place where the latest Claude model sets expectations that can’t be matched by an on-device model. Even Apple Intelligence is getting a lot of negative feedback by the review crowd due to its limitations like being unable to summarize a very large document (which is pretty much the point of such a feature). The problem is that AI has few well-defined use cases and a mountain of expectations, and this really shows in the execution by these companies. It’s hard to build good products when the requirements are “we don’t really know” | |||||||||||||||||||||||
| |||||||||||||||||||||||
▲ | HWR_14 2 days ago | parent | prev | next [-] | ||||||||||||||||||||||
Tons of software runs locally but then still exfiltrates your data for a variety of reasons. Ads, product improvement, metrics, cross-device syncing, etc. | |||||||||||||||||||||||
▲ | a2128 2 days ago | parent | prev | next [-] | ||||||||||||||||||||||
> if the AI PC is just a regular PC with a cloud bot integrated, then ... what even is the point? To trick people into buying new hardware, lest they get left behind in the AI race | |||||||||||||||||||||||
▲ | throwaway290 2 days ago | parent | prev [-] | ||||||||||||||||||||||
> What would be the point of an "AI PC" if not to run models locally? Unrealistic for now because running ML is slow but even if yes, even laypeople know by now you can break an LLM to do what it isn't supposed to. Since this LLM has unlimited access to your personal data to be useful, if I get to it I don't even need to bypass any secure enclaves or what not because it will tell me things I ask for in plain <insert your language> All eggs in one basket |