nacs 6 hours ago
> I really hope more people realize that local LLMs are where it's at

No worries, the AI companies thought ahead: by sending GPU, RAM, and now even hard drive prices through the roof, you won't have a computer to run a local model on.