Last week, during the AMA with Google at SMX West, Google's Gary Illyes said he doesn't see machine learning and artificial intelligence taking over most of Google's ranking algorithms because they would be too hard to debug. He said, yes, they have elements of machine learning, such as RankBrain, and other things (he wouldn't say what other things). But he also said that he doesn't see machine learning completely taking over all the ranking algorithms.
Debugging a machine learning process is too hard because the machines do the work themselves. If someone hand-codes an algorithm, it is much easier to debug issues with it.
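To illustrate the debuggability point, here is a toy sketch in Python. The feature names, weights, and scoring logic are entirely made up for illustration and have nothing to do with Google's actual ranking signals; the contrast is simply that a hand-coded scorer can explain every point of its score, while a trained model just hands back a number.

```python
# Illustrative sketch only: hypothetical features and weights, not real ranking signals.

def hand_coded_score(page):
    """Hand-coded ranking: every contribution is a named, inspectable rule."""
    score = 0.0
    trace = []
    if page["https"]:
        score += 0.5
        trace.append("+0.5 because the page is served over HTTPS")
    if page["mobile_friendly"]:
        score += 0.3
        trace.append("+0.3 because the page is mobile friendly")
    score += 0.1 * page["inbound_links"]
    trace.append(f"+{0.1 * page['inbound_links']:.1f} from inbound links")
    return score, trace  # the trace *is* the debugging story


def learned_score(feature_vector, weights):
    """'Machine-learned' ranking: a dot product over opaque, trained weights.
    A number comes out, but no human-readable reason comes with it."""
    return sum(w * x for w, x in zip(weights, feature_vector))


page = {"https": True, "mobile_friendly": False, "inbound_links": 12}

score, trace = hand_coded_score(page)
print(score)
for line in trace:
    print(" ", line)  # every part of the score is explainable

# Weights produced by some training run: why is the fourth weight negative?
# Nobody hand-wrote it, so there is no rule to point at when a ranking looks wrong.
trained_weights = [0.42, -0.17, 0.08, -0.93, 0.31]
features = [1.0, 0.0, 12.0, 0.6, 0.2]
print(learned_score(features, trained_weights))
```

When the hand-coded score looks wrong, you can read the trace and change the offending rule; when the learned score looks wrong, all you have is a list of weights that no engineer wrote.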
Here are some tweets from the event when he said this:
Going all AI for Google's search algos would be a bad decision because it is very difficult to debug, says @methode #google #smx
— Dustin Woodard (@webconnoisseur) March 23, 2017
If RankBrain makes a ranking decision, it is hard to understand why the decision was made. With traditional algo, you can. @methode #smx
— Jennifer Slegg (@jenstar) March 23, 2017
Probably wouldn't want an entirely ML algo and ditch the regular one. ML algos are very hard or impossible to debug. @methode #smx
— Jennifer Slegg (@jenstar) March 23, 2017
This was not the first time Gary Illyes has said this; he said it back in October 2016 as well:
Going all-in on machine learning is out of question for Google coz it's impossible to debug what's happening in SERPs @methode #digitalzone pic.twitter.com/pgsJSByp6I
— Kevin Richard (@512banque) October 21, 2016
I know Googlers have been saying this for a long, long time, but one day, maybe, the Google algorithm will be run completely by machines. :)
What else in the algorithm is run by machine learning, outside of RankBrain? We know Google Photos uses ML, we know Google Translate does, as does some of YouTube, but what else is used in core search? We don't fully know yet.
Forum discussion at Twitter.