alwillis 3 days ago

> don't see why not, then the era of viable local LLM inferencing is upon us. I don't think local LLMs will ever be a thing except for very specific use cases.

I disagree.

There's a lot of interest in local LLMs in the LLM community. My internet was down for a few days, and I really wished I had a local LLM on my laptop!

There's a big push for privacy; people are using LLMs for personal medical issues, for example, and don't want that data going to the cloud.

Is it really necessary to talk to a server just to check over a letter I wrote?

Obviously, with Apple's release of iOS 26, macOS 26, and the rest of their operating systems, tens of millions of devices are getting a local LLM that third-party apps can take advantage of.
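For the curious, here's a minimal sketch of what calling that on-device model looks like with Apple's Foundation Models framework (the letter-proofreading prompt is my own illustrative example; check Apple's docs for the current API details):

```swift
import FoundationModels

// The system model ships with the OS; check it's actually
// available on this device before using it.
let model = SystemLanguageModel.default

guard model.availability == .available else {
    fatalError("On-device model unavailable on this device/OS")
}

// A session runs entirely on-device: no server round trip,
// which is the privacy point made above.
let session = LanguageModelSession()
let response = try await session.respond(
    to: "Proofread this letter and suggest corrections: ..."
)
print(response.content)
```

The point is that this requires no API key, no network call, and no data leaving the device.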