motorest 10 days ago

> Fast-forward some number of years (...)

I repeat: OP's home server costs about as much as a few months of renting equivalent infrastructure from a cloud provider.

To put it another way, OP can buy brand new hardware a few times per year and still save money compared with paying a cloud provider for equivalent hardware.

> Regarding the equivalent EC2 instance, I'm not comparing it to the cost of a homelab, I'm comparing it to the cost of an Anthropic Pro or Max subscription.

OP stated quite clearly their goal was to run models locally.

ac29 9 days ago | parent

> OP stated quite clearly their goal was to run models locally.

Fair, but at the point where you trust Amazon to host your "local" LLM, it's not a huge reach to just use Amazon Bedrock or something.

motorest 9 days ago | parent

> Fair, but at the point where you trust Amazon to host your "local" LLM, it's not a huge reach to just use Amazon Bedrock or something.

I don't think you even bothered to look at Amazon Bedrock's pricing before making that suggestion. Bedrock charges per input token and per output token, and in a multi-turn chat the growing context is re-sent as input on every turn, so costs compound: a single chat session involving 100k tokens of context can cost you $200. That alone is a third of OP's total infrastructure costs.
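The compounding effect of per-token billing on a multi-turn chat can be sketched like this. The function and the rates below are hypothetical placeholders for illustration, not actual Bedrock prices; check the current Amazon Bedrock pricing page for real per-model figures.

```python
def chat_session_cost(turns, tokens_per_turn, in_price_per_1k, out_price_per_1k):
    """Estimate the cost of a chat where each turn re-sends the growing context.

    turns:            number of user/assistant exchanges
    tokens_per_turn:  tokens added to the context per exchange
    in_price_per_1k:  USD per 1,000 input tokens (hypothetical rate)
    out_price_per_1k: USD per 1,000 output tokens (hypothetical rate)
    """
    total_input = 0
    total_output = 0
    context = 0
    for _ in range(turns):
        context += tokens_per_turn       # context grows every turn
        total_input += context           # the whole context is re-sent as input
        total_output += tokens_per_turn  # the model's reply is billed as output
    return (total_input / 1000) * in_price_per_1k \
         + (total_output / 1000) * out_price_per_1k

# Made-up rates: 20 turns of 5k tokens each ends at a 100k-token context,
# but bills 1.05M input tokens in total because the context is re-sent.
cost = chat_session_cost(turns=20, tokens_per_turn=5000,
                         in_price_per_1k=0.015, out_price_per_1k=0.075)
print(f"${cost:.2f}")  # → $23.25 at these example rates
```

Even at these modest example rates the session bills roughly ten times its final context size in input tokens; with a pricier model or a longer session the total climbs quickly.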

If you want to discuss options in terms of cost, the very least you should do is look at the pricing.