apsurd 7 hours ago
What if the harnesses and loops get sufficiently better, though? CC is using Haiku for code-base grepping and such. Don't you see a local commodity model being "good enough" for the 80% case when paired with better harnesses and tool calls? Honest question; I'm very interested in this, but too casual as of now to know any better.
byzantinegene 4 hours ago | parent
The vast majority of average users don't use LLMs for coding, and for their purposes, local LLMs with low parameter counts are a far cry from SOTA models.