hapticmonkey 6 days ago
If the future is AI, then a future where all compute has to pass through one of a handful of multinational corporations with GPU farms is something to be wary of. Local LLMs are a great idea for smaller tasks.
tonyhart7 6 days ago | parent
But that's not the future; we can already do that right now. The problem is people's expectations: they want the model to be smart. People don't have a problem with whether it's local or not, but they want the model to be useful.