an0malous 6 hours ago

OpenRouter and other LLM platforms are subsidized by VC investment, pricing tokens below what inference actually costs them; the MacBook Pro is not

hankerapp 2 hours ago

Bingo. I, for one, am loving this phase of enjoying LLMs at the expense of VC money, just like I enjoyed cheap Uber rides and deliveries. And given the fragmentation in the field, I don't see a monopoly emerging.

Kwpolska 6 hours ago

When the AI bubble inevitably pops, the author will find a new way to skew results in favor of cloud LLMs. Like including the price of a desk and a chair in the local token cost.

datadrivenangel 6 hours ago

I really wanted the laptop to look better cost-wise, but it doesn't.

an0malous 5 hours ago

I mean, if you're buying it solely as an LLM inference server, it's not. But most people already own a laptop, in which case the marginal hardware cost is practically zero.
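
The amortization argument above can be sketched with some back-of-the-envelope arithmetic. All numbers here are hypothetical assumptions for illustration (laptop price, API price per million tokens), not figures from the thread:

```python
# Hypothetical break-even sketch for local vs. cloud inference.
# Both constants below are assumed example values, not real quotes.

LAPTOP_PRICE_USD = 4000.0       # assumed price of a high-spec laptop
API_COST_PER_MTOK_USD = 10.0    # assumed cloud price per million tokens

def breakeven_tokens(extra_hardware_cost_usd: float) -> float:
    """Tokens you'd need to generate locally before the hardware you
    bought *because of* local inference pays for itself (electricity
    and depreciation ignored for simplicity)."""
    return extra_hardware_cost_usd / API_COST_PER_MTOK_USD * 1_000_000

# Case 1: buying the laptop purely as an inference server.
dedicated = breakeven_tokens(LAPTOP_PRICE_USD)
print(f"{dedicated:,.0f} tokens")       # 400,000,000 tokens

# Case 2: you already own the laptop anyway, so the marginal
# hardware cost attributable to inference is ~$0.
already_owned = breakeven_tokens(0.0)
print(f"{already_owned:,.0f} tokens")   # 0 tokens
```

Under these assumed prices, a dedicated machine only pays off after hundreds of millions of tokens, while an already-owned laptop breaks even immediately, which is the distinction the comment is drawing.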