motorest 10 days ago
> Fast-forward some number of years (...)

I repeat: OP's home server costs as much as a few months of a cloud provider's infrastructure. To put it another way, OP can buy brand-new hardware a few times per year and still save money compared with paying a cloud provider for equivalent hardware.

> Regarding the equivalent EC2 instance, I'm not comparing it to the cost of a homelab, I'm comparing it to the cost of an Anthropic Pro or Max subscription.

OP stated quite clearly that their goal was to run models locally.
ac29 9 days ago | parent
> OP stated quite clearly that their goal was to run models locally.

Fair, but once you trust Amazon to host your "local" LLM, it's not a huge reach to just use Amazon Bedrock or something.