hapticmonkey 6 days ago

If the future is AI, then a future where every compute has to pass through one of a handful of multinational corporations with GPU farms...is something to be wary of. Local LLMs is a great idea for smaller tasks.

tonyhart7 6 days ago | parent [-]

But it's not the future; we can already do that right now.

The problem is people's expectations: they want the model to be smart.

People don't have a problem with whether it's local or not; they want the model to be useful.

aurareturn 5 days ago | parent [-]

Sure, that's why local LLMs aren't popular or mass-market as of September 2025.

But cloud models will hit diminishing returns, local hardware will get drastically faster, and techniques for running inference on them efficiently will be worked out further. At some point, local LLMs will have their day.

tonyhart7 5 days ago | parent [-]

Only in theory, and that's not going to happen.

The same thing is happening in the software and game industries.

Because the free market forces people to raise the bar every year, the requirements of apps and games are never settled; they only go up.

Humans will never be satisfied; the boundary will be pushed further.

That's why smartphones now ship with 12GB or 16GB of RAM just for the system + apps.

And now we must accommodate a local LLM on top of that? It only goes up; people will demand smarter and smarter models.

Today's frontier models will be deemed unusable (dumb) in 5 years.

Example: people were literally screaming in agony when Anthropic quantized their model.
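
For context on why quantization upsets people: here's a minimal sketch of symmetric int8 post-training quantization (nothing to do with Anthropic's actual setup, which isn't public; this is just the generic technique). The weights are rounded onto 256 integer levels, and that rounding error is the precision loss users notice:

```python
# Toy symmetric per-tensor int8 quantization (illustrative only,
# NOT any vendor's actual scheme).
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float weights onto int8 with a single per-tensor scale."""
    scale = np.abs(w).max() / 127.0      # largest weight maps to +/-127
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 codes."""
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)  # pretend weight matrix
q, s = quantize_int8(w)
w_hat = dequantize(q, s)

# Rounding error introduced by quantization: bounded by half a step (s/2),
# but it accumulates across billions of weights and every layer.
err = np.abs(w - w_hat).max()
```

The memory win (4 bytes down to 1 per weight) is exactly why local LLMs lean on quantization so heavily, and the rounding error is the "dumber model" trade-off being complained about.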