ElFitz 2 hours ago
> Including a 4gb is a negligible amount of space for current hardware and Chrome is not known as the browser to run on resource constrained devices.

4GB definitely isn't a negligible amount of space on most people's devices. The apparently quite successful MacBook Neo has 256GB of storage in its base configuration. A MacBook Air and a basic sub-$1000 Dell laptop start at 512GB.

> To put 4gb in context, I currently have 2 tabs open that nearly take up 4gb.

You are conflating disk and memory.

> The fact Chrome also has a way to disable this makes it kind of a nothingburger in my opinion.

There's a reason they picked an opt-out model for this, and not an opt-in approach.

But I also see the point in it. We recently did a hackathon, and I considered relying on Gemma 4 for privacy reasons: the local model could interpret the user's natural language request and derive less privacy-revealing requests to send based on that. But then, a web app that shows people a loading screen while it downloads a 4GB model probably wouldn't be a best-selling UX.
derangedHorse 2 hours ago | parent
> You are conflating disk and memory.

I never conflated anything. I said it's a negligible amount of space for current hardware, which I still believe. If anything, the fact that I'd consider the size acceptable even measured against a modern laptop's RAM only reinforces the point.

> There's a reason they picked an opt-out model for this, and not an opt-in approach.

That's the approach they take for most of their features.

> But then, a web app that shows people a loading screen while it downloads a 4GB model probably wouldn't be a best-selling UX.

Which seems to be the motivation for having these local models embedded among the browser's available resources: https://developer.chrome.com/docs/ai/prompt-api
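For context, the Prompt API linked above lets a page use the browser's bundled local model instead of shipping its own multi-GB download. A minimal sketch, assuming the experimental `LanguageModel` global that Chrome exposes behind a flag/origin trial (the surface is experimental and names have changed between Chrome versions, so treat this as illustrative):

```javascript
// Sketch of calling Chrome's experimental Prompt API, with a fallback
// for environments (other browsers, Node) where it doesn't exist.
async function localPrompt(text) {
  // Feature-detect: the global only exists in Chrome builds with the
  // built-in model enabled.
  if (typeof LanguageModel === 'undefined') {
    return null; // no local model in this environment
  }
  // availability() reports whether the model is ready, downloadable, etc.
  const availability = await LanguageModel.availability();
  if (availability === 'unavailable') {
    return null;
  }
  // create() may trigger the model download on first use — this is the
  // part the browser amortizes across all sites instead of each web app
  // showing its own 4GB loading screen.
  const session = await LanguageModel.create();
  return session.prompt(text);
}

localPrompt('Rewrite this request without personal details: ...')
  .then((reply) => console.log(reply ?? 'Prompt API not available here'));
```

The feature-detect-and-fallback shape matters here precisely because the API is opt-out on Chrome and absent everywhere else: a page can't assume the model is present or already downloaded.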