There is a distinctive focus on AI in current search engine algorithms. Google is the most obvious in applying such technologies, but it is not the only one. Among the things search engines use in their machine learning processes are studying user behaviour, recording actions, and drawing conclusions from the resulting data.
The more traditional signals that search engines have used to establish the popularity of a given site include, among others, the volume of inbound links and their quality score, based on the relevancy of their anchor text and its correlation to the theme of the target site. Several factors are involved in evaluating this link data, which helps search engines determine a site's popularity and the importance of its content. In the new approach, the engines have been able to gather a large amount of aggregate usage data from their various programs and services, which allows them to develop their algorithms into more accurate and relevant services.
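As a toy illustration of the link-based signals described above, a crude popularity score might combine link volume with anchor-text relevance. The data, weights, and relevance measure here are all invented for the sketch; real engines use far more sophisticated, undisclosed models.

```python
# Toy sketch of a link-based popularity score. The link data,
# quality values, and relevance measure are hypothetical -- this is
# not any search engine's actual formula.

def anchor_relevance(anchor_text, page_topic_terms):
    """Fraction of anchor-text words that match the target page's topic terms."""
    words = anchor_text.lower().split()
    if not words:
        return 0.0
    matches = sum(1 for w in words if w in page_topic_terms)
    return matches / len(words)

def popularity_score(inbound_links, page_topic_terms):
    """Sum each link's (assumed) quality, weighted by its anchor relevance."""
    return sum(
        link["quality"] * anchor_relevance(link["anchor"], page_topic_terms)
        for link in inbound_links
    )

links = [
    {"anchor": "best travel guides", "quality": 0.9},  # relevant anchor
    {"anchor": "click here", "quality": 0.5},          # irrelevant anchor
]
topic = {"travel", "guides", "holidays"}
print(round(popularity_score(links, topic), 2))
```

Under this sketch, a link whose anchor text matches the target site's theme contributes far more than a generic "click here" link of similar quality, mirroring the relevancy weighting the article describes.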
Nowadays, geotargeting (the personalisation of search results based on the user's geographic location), browser-based signals, and social bookmarking are just a few of the new elements added to the equation. Traffic-source information for given search results also counts. This allows search engines to evaluate sites based on how popular they truly are, and to weight their content and position in SERPs accordingly.
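To make the "added to the equation" idea concrete, the signals listed above could be blended into a single ranking score as a weighted sum. The signal names, weights, and values below are purely illustrative assumptions, not a documented ranking formula.

```python
# Toy sketch: blending several normalised (0..1) usage signals into
# one ranking score. Signal names and weights are invented for
# illustration only.

WEIGHTS = {
    "link_popularity": 0.4,
    "geo_relevance": 0.2,
    "bookmark_count": 0.2,
    "traffic_quality": 0.2,
}

def blended_score(signals):
    """Weighted sum of the page's signal values; missing signals count as 0."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

page = {
    "link_popularity": 0.8,
    "geo_relevance": 1.0,   # strong match to the searcher's location
    "bookmark_count": 0.3,
    "traffic_quality": 0.6,
}
print(round(blended_score(page), 2))
```

The design point is simply that each new signal shifts the final position in the SERPs in proportion to its weight, so adding geotargeting or bookmarking data changes rankings without discarding the older link-based signals.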
http://www.website-articles.net/Article/Search-Engines-and-Artificial-Intelligence/9777