tonyhart7 6 days ago
But it's not the future; we can already do that right now. The problem is people's expectations: they want the model to be smart. People don't have a problem with whether it's local or not, but they want the model to be useful.
aurareturn 5 days ago | parent
Sure, that's why local LLMs aren't popular or mass market as of September 2025. But cloud models will hit diminishing returns, local hardware will get drastically faster, and techniques for running inference efficiently will keep improving. At some point, local LLMs will have their day.