rvz 7 hours ago

Anthropic doesn't have anything other than the Claude models.

But notice that no one is mentioning DeepSeek, not a single mention, which tells me they are preparing to shake everyone up again. Which is why Dario continues to scare-monger about local models.

Sometimes you do not need hundreds of billions of dollars for inference when it can be done locally with efficient software; Google proved that. But where is the money in that? So the flawed belief continues that you must buy infinitely more GPUs to scale, which is exactly what Nvidia needs you to believe.

It's only a matter of time before local models reach Opus level. We are 1, or at most 2, years behind that, and Anthropic knows it.

p12tic 7 hours ago | parent | next [-]

> It's only a matter of time before local models reach Opus level. We are 1, or at most 2, years behind that, and Anthropic knows it.

Can confirm. Kimi K2.5 is pretty intelligent, and most of the time there's no noticeable difference between Opus and Kimi.

randomNumber7 6 hours ago | parent | prev [-]

Local models just make no economic sense since the GPU will idle 99% of the time.

zozbot234 6 hours ago | parent | next [-]

You already have a GPU (at least an iGPU, and an NPU on most newer platforms) as part of your computer, so you might as well get some use out of it with local inference. And running inference on a larger model with an undersized GPU will have it idling a lot less than 99% of the time. That still makes sense for most casual users, who will only rarely need a genuine "Pro"-class answer from an AI. Doing it locally is far less hassle than paying for a subscription or managing API spend.

amazingamazing 4 hours ago | parent | prev | next [-]

False for a team that's distributed.

twoodfin 6 hours ago | parent | prev [-]

[dead]