mattlondon 11 hours ago

Yet there is another post a few rows down where people are losing their shit that Chrome has a local LLM model that uses a couple of GB of space for local-inference.

Damned if they do, damned if they don't.

dlcarrier 11 hours ago | parent | next [-]

Maybe don't use gigabytes of bandwidth and storage space, without asking.

hparadiz 10 hours ago | parent [-]

Easy. Stop using Chrome.

userbinator 9 hours ago | parent | prev | next [-]

If I want a model I'll go download one. (And I did, not long ago, to play around with image generation.)

bytecauldron 11 hours ago | parent | prev | next [-]

This is a bit disingenuous. People aren't losing their shit about a local model being installed. It's the lack of user autonomy. Just give the option to download a model instead of a silent install. It's not that hard. This is how every other local option works.

wmf 10 hours ago | parent [-]

AFAIK Apple and MS auto-download local models.

FridgeSeal 7 hours ago | parent | next [-]

The former has made a big deal about local inference and marketed it as an OS-level feature.

You can also…turn it off.

Chrome silently opted people in _and_ downloaded the model without asking, because they decided that's something they (Chrome) fancied doing.

The difference should be pretty obvious.

bytecauldron 6 hours ago | parent | prev | next [-]

Sorry, I should have been more specific. This is how every *good* local option works.

aabhay 11 hours ago | parent | prev | next [-]

This is a weird take. If it's not opt-in, or you're shoehorning it into a browser, then that sucks. Nobody is getting enraged that an app for running local LLMs downloads data to do so.

avadodin 10 hours ago | parent [-]

Although you can opt out, and in some cases even disable the download feature at build time, most local LLM tools are too download-happy by default.

fg137 10 hours ago | parent | prev | next [-]

You might want to read the comments to understand what people are actually complaining about.

This comment is quite dishonest about the nature of the discussion.

themafia 11 hours ago | parent | prev | next [-]

If it was such a good and laudable idea why didn't they tell me about it before they activated it? It seems to me like they avoided it in the hopes that I wouldn't notice, because, presumably if I had, I would have IMMEDIATELY disabled it.

Also, why doesn't their task manager show that it's actually the one downloading? Why does it go out of its way to hide this activity?

Since I have conky on my desktop, I caught this immediately and took the action I preferred with my own computer, which was to _immediately_ disable it.

StilesCrisis 11 hours ago | parent [-]

I'm guessing you immediately close the What's New Chrome tab when you update?

https://developer.chrome.com/blog/new-in-chrome-148#prompt-a...

https://www.google.com/chrome/ai-innovations/

They have absolutely not been shy about any of this.

themafia 11 hours ago | parent [-]

I've never had a "What's new" tab ever open because I disable the customized home page where that's displayed. I'm guessing you're not aware that's an option.

Please show me where in either of those documents it explains it's going to download a 4GB model.

crazygringo 10 hours ago | parent [-]

I use an extension that gives me a customized homepage, but I still always get the "what's new" tab on every major version upgrade.

It's a totally separate tab that opens. It's got nothing to do with what you use as your homepage.

ekjhgkejhgk 11 hours ago | parent | prev [-]

You don't understand the difference between "I run a local LLM because I chose to" vs "The browser chose to run a local LLM and I have no say"? You don't understand?

Not to mention that the LLM I choose to run requires a monster machine and is infinitely more capable than whatever Google chose to put in their browser?

I mean, none of this affects me because I don't use chrome, obviously, but you don't see the difference? Bewildering.

StilesCrisis 11 hours ago | parent [-]

Did you opt into WebGPU? QUIC? Canvas 2D? Brotli? Browsers don't work that way.

za_creature 10 hours ago | parent [-]

The size difference between the local LLM and all of the above is about... the size of the local LLM.