inciampati · 5 hours ago
I'm already living in this future. In a decent execution framework, with context management, memory via Unix, and mechanisms for web search and access, local models are effectively on par with frontier ones, and often much faster. I'll keep paying the AI companies' fees as long as they keep genuinely subsidizing and leading, but they're getting close to the edge of their utility, and we can use their services now to bootstrap their own demise. Long live running your own software on your own computer.
gwerbin · 20 minutes ago
What setup are you using? What models, what hardware, what agent harness, etc.? I have the vague sense that this is all possible right now, but the amount of tinkering required doesn't seem worth it compared to just not using AI and getting things done the old-fashioned way.