zozbot234 | 6 hours ago
> Can you run an industry level LLM at home? Assuming that by "at home" you mean ordinary hardware, not something that costs as much as a car: yes, very slowly, for simple tests. (Not proprietary models, obviously, but quite capable open-weights ones nonetheless.) It's not exactly viable for agentic coding, which needs boatloads of tokens for even the simplest things. But you can also run smaller local models that are still quite capable for many tasks.
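The "very slowly on ordinary hardware" point comes down to memory footprint. A rough sketch (my own back-of-envelope, not from the comment): weight memory in GB is roughly parameter count in billions times bytes per parameter, so a large open-weights model at 4-bit quantization already exceeds typical consumer GPU VRAM and spills into system RAM, where inference crawls.

```python
def weights_gb(params_b: float, bits: int) -> float:
    """Approximate weight memory in GB for a model stored at `bits` per parameter.

    params_b * 1e9 parameters, each bits/8 bytes, divided by 1e9 bytes/GB
    simplifies to params_b * bits / 8.
    """
    return params_b * bits / 8

# A ~70B model at 4-bit needs ~35 GB just for weights (KV cache is extra),
# beyond most consumer GPUs; an 8B model at 4-bit fits in ~4 GB.
for params, bits in [(70, 16), (70, 4), (8, 4)]:
    print(f"{params}B @ {bits}-bit: ~{weights_gb(params, bits):.0f} GB")
```

This ignores the KV cache and activation memory, so treat it as a lower bound, but it explains why big models "work" locally yet are far too slow to burn tokens on agentic loops.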