I know computers and algorithms can be funny, but this one is clearly backwards: when you search for [wank] in Google Images with the SafeSearch filter set to moderate, little or no pornographic or nude content comes up. But when you raise the SafeSearch filter to the strict setting, plenty of nude and pornographic content appears.
If the looser version of SafeSearch is blocking this content, what is wrong with the stricter version of the filter?
Here is a picture of the current results under moderate SafeSearch:
Here is a picture of the results after simply changing the filter from moderate to strict:
Of course, I blurred out the nude images. Keep in mind, there are plenty more as you scroll down.
Like I said, I assume this is just a weird quirk in Google Images' SafeSearch algorithm and that Google will fix it. But it is strange nevertheless.
A system administrator who manages about 700 computers noticed this while running some tests. He posted about the issue in the Google Web Search Help forums, writing:
We have about 700 computers and our internal policies ensure that Google settings for safe search are set to “Strict”. With this we normally have no issues with people accessing pornography. During normal testing we have found that searching for the word “wank” provides very inappropriate images that should not be displayed with safe search in use. I tried phoning Google with no luck. Does anyone know how to contact someone who can assist with this?
Here are some other stories similar to this we covered in the past:
- Porn on Google Image Search with Strict Search On
- Google Search By Image Thinks I'm A Porn Star
- Google Recommends Reporting Mass Porn In Forums
- Google's Porn Issue With Children Related Keywords
- Pirelli Tires? Nope, Pirelli Porn In Google
- Google Background Image Of Naked Women
- Very Explicit Porn Hits Google Universal Search
Forum discussion at Google Web Search Help.