aurareturn 5 days ago
Sure, that's why local LLMs aren't popular or mass market as of September 2025. But cloud models will hit diminishing returns, local hardware will get drastically faster, and techniques for efficient inference will be worked out further. At some point, local LLMs will have their day.
tonyhart7 5 days ago | parent
Only in theory, and that's not going to happen. This is the same thing that happened in the software and game industry: the free market forces people to raise the bar every year, so the requirements of apps and games are never met. They only go up; humans are never satisfied, and the boundary keeps getting pushed further. That's why we have 12 GB or 16 GB of RAM in smartphones right now just for the system plus apps, and now we must accommodate a local LLM too??? It will only go up. People will demand smarter and smarter models; today's frontier model will be deemed unusable (dumb) in 5 years. Example: people were literally screaming in agony when Anthropic quantized their model.