IMO, the secret behind Google Search is not the smartness of the algorithms, but how much of that smartness is baked into the O(100ms) it takes Google's machines to answer your query. That's why the links above are the true reason Google Search performs well.
The state of the art in Natural Language Processing and Information Retrieval is far beyond what Google/Bing/Yahoo/Yandex/Baidu employ. But it's far too expensive to serve at the QPS and latency required for a decent UX.
I'd be super interested in a higher-quality delayed search: each day you could go and look at your search queries from the day before, and it would list any better results it found with more time.
Then again, maybe I'm underestimating how often I don't know exactly what I'm looking for and just grab the first result.
Hi, would you know of any good explanations of whether Google (or search engines in general) creates a generic set of results for a given query, which is then tweaked with customizations specific to individuals / locations / languages? I.e., have they "saved" the basic search output in advance so the core doesn't have to be run each time, and only adjust around the edges for a specific user?
Or is that not how it's done, and does each search, for a given person, follow the same full process?
To my understanding, while there's some very short-lived caching of results, there's generally still going to be a lot of hitting the index, because there's just so much new information coming in all the time. We can't store a set of results for, say, "cars" and assume it's going to be the same info from one minute to the next.
And results don't really have a lot of personalization for individuals. When you see differences, it's usually due to language and location.
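I'm not claiming this is how Google actually implements it, but here's a minimal sketch of the idea as described above: cache results under a key of (query, language, location) with a short TTL, so the expensive index pass is skipped only briefly, and the variation you see comes from coarse locale signals rather than per-user personalization. All names here (TTL_SECONDS, cached_search, fetch_from_index) are hypothetical.

```python
import time

# Short TTL: results for a query go stale quickly as new documents arrive.
TTL_SECONDS = 60

# Cache keyed by (query, language, location) -- coarse signals, not a user ID.
_cache: dict[tuple[str, str, str], tuple[float, list[str]]] = {}

def fetch_from_index(query: str, language: str, location: str) -> list[str]:
    """Stand-in for the expensive retrieval + ranking pass over the index."""
    return [f"result for {query!r} ({language}, {location})"]

def cached_search(query: str, language: str, location: str) -> list[str]:
    key = (query, language, location)
    now = time.time()
    hit = _cache.get(key)
    if hit is not None and now - hit[0] < TTL_SECONDS:
        return hit[1]                       # fresh enough: reuse cached results
    results = fetch_from_index(query, language, location)
    _cache[key] = (now, results)            # otherwise hit the index again
    return results
```

In this sketch, two people in the same country searching in the same language within the TTL window get the same results, which matches the point that differences are mostly about language and location rather than individual profiles.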