jillesvangurp 7 hours ago
Because it's not a simple problem space. Lucene has gone through about three decades of optimization, feature development, and performance tuning; a lot of brain power has gone into that. Google bootstrapped the AI revolution as a side effect of figuring out how to do search better. They started by hiring a lot of expert researchers who then got busy iterating on interesting search-engine-adjacent problems (like figuring out synonyms, translations, etc.). In the process they got into running neural networks at scale, figuring out how to leverage GPUs, and eventually building their own TPUs. The Acquired podcast recently did a great job of outlining the history of Google & Alphabet.

Doing search properly at scale mainly requires a lot of infrastructure, and that's Google's real moat. They get to pay for all of it with an advertising money-printing machine, which, by the way, leverages a lot of search algorithms: matching advertisements to content is itself a search problem, and Google just got really good at that. That's what finances all the innovation in this space, from deep learning to TPUs. Being able to throw a few hundred million at running some experiments is what makes the difference here.