tim333 20 hours ago:
Google now adds an LLM result to about half of search results. I think they've figured out how to do that without using too many resources.
parliament32 20 hours ago (parent):
Google doesn't have to fight context windows. They can cache and store an AI response to a Google query without having to worry about much beyond locale and the like. You can't do that a dozen messages into an LLM conversation.
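The caching asymmetry described above can be sketched as follows (a minimal illustration with hypothetical function names, not Google's actual implementation): a search query caches well because its key is just the normalized query plus locale, while a chat response's key must cover the entire conversation history, so two users almost never share one.

```python
import hashlib

def search_cache_key(query: str, locale: str) -> str:
    # Normalize whitespace and case so near-identical queries share a key.
    normalized = " ".join(query.lower().split())
    return hashlib.sha256(f"{locale}|{normalized}".encode()).hexdigest()

def chat_cache_key(messages: list[str]) -> str:
    # Every prior message changes the key, so cache hits become
    # vanishingly rare a dozen messages into a conversation.
    return hashlib.sha256("\n".join(messages).encode()).hexdigest()

# Two users asking the same thing phrase it slightly differently:
assert search_cache_key("weather  Paris", "fr") == search_cache_key("Weather Paris", "fr")
assert chat_cache_key(["hi", "weather paris"]) != chat_cache_key(["hello", "weather paris"])
```

The point is that the search key collapses many user inputs onto one cached entry, while the chat key fans out with every differing turn of history.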