pogue 2 days ago
> Weirdly, from their help page[1] they mention needing to "Have a high performance computer" as a requirement, and that

>> When you turn on "History search, powered by AI," in addition to the page title and URL, the page contents of the website you browse at that time are stored locally.

> and that the contents are even encrypted at rest, which makes you start to think they did it the right way

But then, no: AI is like a beast that has to be continuously fed forever to keep growing, and of course Google knows this. So they're always going to take your data to feed their beast, to stay at least abreast of, if not ahead of, the competition.

I'm sure Google is also getting the message from publishers that they're sick of having their websites scraped by GoogleBot only for those results to wind up in the AI Summary and not lead to any actual traffic. So, what could Google do? What if they made everyone who runs Chrome scrape that data for them vicariously, just through normal browsing? Not only that: what if, in addition to having users scrape that data, Google also had them process it locally on their own computers to save on cloud computing costs? Just a theory... ;)

> FWIW this has been shipping for a long time. Try doing a reservation through Google Maps. If there's no OpenTable support or whatever, it'll make the phone call for you.

That's cool about placing the call. Does it actually talk to the person on the other end, set up times and all that kind of stuff like they showed in the demo?