Kwpolska 6 hours ago

When the AI bubble inevitably pops, the author will find a new way to skew results in favor of cloud LLMs. Like including the price of a desk and a chair in the local token cost.

datadrivenangel 6 hours ago | parent

I really wanted the laptop to look better cost-wise, but it doesn't.

an0malous 5 hours ago | parent

I mean, if you’re buying it just as an LLM inference server, it’s not. But most people already have laptops, in which case the hardware is practically free.
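
The disagreement above comes down to whether the hardware price counts as a marginal cost or a sunk cost. A back-of-envelope sketch makes the two framings concrete — every number here (laptop price, wattage, tokens/sec, electricity rate, cloud pricing) is a hypothetical assumption for illustration, not a measurement:

```python
# Illustrative cost-per-million-tokens comparison: cloud inference vs.
# local inference, under two framings of the hardware cost.
# All figures are hypothetical assumptions, not benchmarks.

def local_cost_per_mtok(hardware_price: float,
                        amortize_over_mtok: float,
                        watts: float,
                        tokens_per_sec: float,
                        usd_per_kwh: float) -> float:
    """Local cost = amortized hardware + electricity, per million tokens.
    Setting hardware_price to 0 models the 'you already own the laptop'
    case, where the machine is a sunk cost and only electricity is marginal."""
    hardware = hardware_price / amortize_over_mtok
    seconds_per_mtok = 1_000_000 / tokens_per_sec
    kwh_per_mtok = watts * seconds_per_mtok / 3600 / 1000
    return hardware + kwh_per_mtok * usd_per_kwh

# Hypothetical figures: $2000 laptop amortized over 500M tokens,
# 60 W draw, 20 tokens/sec, $0.15/kWh, cloud priced at $0.60/Mtok.
cloud      = 0.60
buying_new = local_cost_per_mtok(2000, 500, 60, 20, 0.15)
sunk_cost  = local_cost_per_mtok(0,    500, 60, 20, 0.15)

print(f"cloud:                        ${cloud:.2f}/Mtok")
print(f"local (buying the laptop):    ${buying_new:.2f}/Mtok")  # → $4.13/Mtok
print(f"local (laptop already owned): ${sunk_cost:.2f}/Mtok")   # → $0.13/Mtok
```

Under these made-up numbers both comments can be right at once: counting the purchase price, local loses badly; counting only marginal electricity on a machine you already own, local wins.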