dkobia 20 hours ago
I’m running local models on a maxed-out M4, but I find them useful and reliable only for trivial tasks and for sensitive items like database optimization work. Local LLMs just don’t come anywhere close to Claude or Codex for heavy work.