zozbot234 | 5 hours ago
> i.e. plans/API calls that make this practical at scale are expensive

Local AIs make agent workflows a whole lot more practical. Making the initial investment in a good homelab/on-prem facility will effectively become a no-brainer given the advantages in privacy and reliability, and you don't have to fear rugpulls or VCs playing the "lose money on every request" game, since you know exactly how much you're paying in power costs for your overall load.
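A rough back-of-the-envelope version of that "know exactly what you're paying in power" claim; every number below is an illustrative assumption (GPU draw, overhead, electricity rate), not a figure from the comment:

```python
# Rough power-cost sketch for a local inference box.
# All numbers are assumptions and will vary with hardware and locale.

gpu_watts = 400          # assumed draw of one inference GPU under load
rest_of_box_watts = 150  # assumed CPU, RAM, fans, PSU overhead
price_per_kwh = 0.15     # assumed electricity price in USD

total_kw = (gpu_watts + rest_of_box_watts) / 1000
cost_per_hour = total_kw * price_per_kwh
cost_per_month = cost_per_hour * 24 * 30

print(f"~${cost_per_hour:.3f}/hour, ~${cost_per_month:.0f}/month at full load")
# -> roughly $0.082/hour, about $59/month under these assumptions
```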
slopusila | 5 minutes ago
On-prem economics don't work because you can't batch requests, unless you are able to run 100 agents at the same time, all the time.
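A sketch of the batching argument, under the simplifying assumption that a GPU costs roughly the same per hour whether it serves one request or a full batch, so throughput scales close to linearly with concurrency up to some limit; the concrete numbers are made up for illustration:

```python
# Why batching matters for on-prem economics (illustrative numbers only).

gpu_cost_per_hour = 0.08      # power-only figure from the sketch above
requests_per_hour_at_b1 = 60  # assumed single-stream throughput

for batch_size in (1, 4, 16, 64, 100):
    throughput = requests_per_hour_at_b1 * batch_size  # idealized scaling
    cost_per_request = gpu_cost_per_hour / throughput
    print(f"batch={batch_size:>3}: ~${cost_per_request:.5f} per request")

# A cloud provider keeps its GPUs near the batch=100 line around the clock;
# a single-user homelab mostly sits near batch=1 or idle, which is the
# utilization gap the comment is pointing at.
```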
vbezhenar | 4 hours ago
I don't care about privacy, and I haven't had many problems with the reliability of AI companies. Spending a ridiculous amount of money on hardware that will be obsolete in a few years and won't be utilized at 100% during that time is not something many people would do, IMO. Privacy is good when it comes for free. I would rather spend money on some pseudo-local inference, where a cloud company manages everything for me and I just specify an open-source model and pay for GPU usage.
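A minimal sketch of that "pseudo-local" setup, assuming the managed GPU provider exposes an OpenAI-compatible endpoint (many do, but that is an assumption here); the base URL, API key, and model name are placeholders, not anything from the comment:

```python
# Pseudo-local inference: an open-weights model hosted by a managed GPU
# provider, reached through an OpenAI-compatible API. Endpoint and model
# name below are hypothetical placeholders.

from openai import OpenAI

client = OpenAI(
    base_url="https://your-gpu-provider.example/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="your-open-weights-model",  # e.g. a Llama- or Qwen-family model
    messages=[{"role": "user", "content": "Summarize this log file for me."}],
)
print(response.choices[0].message.content)
```

The appeal of this arrangement is that the client code stays the same whichever open-source model or provider you pick, while the provider carries the hardware and utilization risk.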