NekkoDroid 2 days ago
Yeah, I've been waiting a while for a model in the ~12–13 GB range, so there's still a bit of headroom for all the other things running on the system that, for some reason, eat VRAM.
vparseval a day ago | parent
I've found that you can run models locally reasonably well even when they exceed your VRAM by a bit. Ollama, at least, will offload the excess to system RAM. Performance suffers, but I've never actually seen it crap out, and I can wait a few minutes for a response.
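If you'd rather control the VRAM/RAM split yourself instead of relying on the automatic spillover, ollama exposes a `num_gpu` parameter that caps how many model layers are kept in VRAM; the rest run from system RAM on the CPU. A minimal Modelfile sketch (the base model name and layer count here are placeholders, not a recommendation):

```
# Modelfile — cap GPU offload so a slightly-too-big model still loads
FROM llama3:8b

# Keep only 20 layers in VRAM; remaining layers run on CPU/system RAM.
# num_gpu 0 would force pure CPU inference.
PARAMETER num_gpu 20
```

Then build and run it with `ollama create mymodel -f Modelfile` followed by `ollama run mymodel`.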