simonw | 5 days ago
The gpt-oss-20b model has demonstrated that a machine with ~13GB of available RAM can run a very decent local model - and if that RAM is GPU-accessible (as on Apple silicon Macs, for example) you can get very usable performance out of it too. I'm hoping that within a year or two machines like that will have dropped further in price.
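
As a rough illustration of what "running it locally" can look like: a minimal Python sketch, assuming you already have a local runner such as Ollama (or llama.cpp's llama-server) serving gpt-oss-20b behind its OpenAI-compatible endpoint. The base URL, port, and model tag below are assumptions about that setup, not something from the comment above - adjust them to whatever your runner actually exposes.

    # pip install openai
    from openai import OpenAI

    # Point the standard OpenAI client at a local server instead of the cloud.
    # http://localhost:11434/v1 is Ollama's default OpenAI-compatible endpoint;
    # local servers generally ignore the API key, so any placeholder works.
    client = OpenAI(
        base_url="http://localhost:11434/v1",
        api_key="local",
    )

    # "gpt-oss:20b" is the tag the model is published under on Ollama;
    # other runners may name it differently.
    response = client.chat.completions.create(
        model="gpt-oss:20b",
        messages=[{"role": "user", "content": "In one sentence, what is unified memory?"}],
    )
    print(response.choices[0].message.content)

The point of going through an OpenAI-compatible endpoint is that the same few lines work unchanged whether the model is running on an Apple silicon Mac's unified memory, a discrete GPU, or a remote box.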