annjose, 9 hours ago:
> today's best runnable-offline model is roughly 6–8 months behind today's frontier.

But it doesn't matter, because frontier models were already extremely good 8 months ago and we were doing real work with them. Now we also have more capable open-source agents like pi and OpenCode that work well with these models. More importantly, offline models are the best choice for privacy, on-device inference, and freedom from token/cost anxiety.
swrrt, 4 hours ago:
Yep, offline mode is useful for edge devices too. I'm actually considering deploying an extremely small model on a Steam Deck.
ubermon, 7 hours ago:
Totally agree! I think we are very early in discovering the full potential of local models.