schnitzelstoat a day ago
Yeah, I don't think local LLMs will keep up with what the massive corporations put out. But they might reach a level of performance where the gap just doesn't matter for most users. And people would prefer to run a model locally for 'free' (not counting the energy cost) rather than paying for an LLM subscription.