On Friday, Google published a post titled "How Google autocomplete works in Search," and in that post, as I dug into at Search Engine Land, Google announced it is expanding the types of predictions it removes from autocomplete.
If you've been following Danny Sullivan on Twitter, you know he has been responding to a ton of complaints about how and what Google shows as autocomplete predictions as you type your search query. Google's post not only explains how autocomplete works and why it isn't perfect, but also shows that Google is changing the types of predictions it will remove going forward.
In addition to removing these sorts of predictions:
- Sexually explicit predictions that are not related to medical, scientific or sex education topics.
- Hateful predictions against groups and individuals on the basis of race, religion or several other demographics.
- Violent predictions.
- Dangerous and harmful activity in predictions.
- Spam.
- Predictions closely associated with piracy.
- Predictions removed in response to valid legal requests.
Google is now also removing predictions that are "perceived as hateful or prejudiced toward individuals and groups, without particular demographics." Previously, a prediction had to relate to "race, ethnic origin, religion, disability, gender, age, nationality, veteran status, sexual orientation or gender identity" to qualify for removal, Google said; that is no longer the case.
Here is a video showcasing how it works:
Here are some insane stats about autocomplete (a quick sanity check follows the list):
- On average, it reduces typing by about 25 percent
- Cumulatively, we estimate it saves over 200 years of typing time per day. Yes, per day!
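Those figures are Google's, but a quick back-of-envelope calculation suggests they hang together. The sketch below is purely illustrative: the searches-per-day number is an assumed outside estimate (roughly 3.5 billion is a commonly cited figure), not something from Google's post.

```python
# Back-of-envelope sanity check on "200 years of typing time saved per day".
# The searches-per-day figure is an assumed outside estimate, not from Google's post.

SECONDS_PER_YEAR = 365 * 24 * 3600   # ~31.5 million seconds in a year
years_saved_per_day = 200            # Google's stated estimate
searches_per_day = 3.5e9             # assumed; commonly cited rough estimate

seconds_saved_per_day = years_saved_per_day * SECONDS_PER_YEAR
seconds_saved_per_search = seconds_saved_per_day / searches_per_day

print(f"Total time saved per day: {seconds_saved_per_day / 1e9:.1f} billion seconds")
print(f"Average saving per search: {seconds_saved_per_search:.1f} seconds")
# Works out to roughly 1.8 seconds per search, which squares with cutting
# typing "by about 25 percent" on a query that takes several seconds to type.
```

Under those assumptions, the two stats are consistent with each other, which is about all a rough check like this can show.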
The latest in our behind-the-scenes series looks at how autocomplete creates predictions to speed up Google searches, how inappropriate predictions are dealt with and how information such as weather and sports scores may appear as you type. https://t.co/ZZeI2EMTdv
— Google SearchLiaison (@searchliaison) April 20, 2018
Forum discussion at Twitter.