There is a massive thread in the Google Web Search Help forums that started about a week ago with complaints about how Google's search results are biased and racist.
The complaints specifically compare a Google Image search for [sexy white mom] with one for [sexy black mom]. The search for [sexy white mom] returns clothed white women, whereas the search for [sexy black mom] returns naked black women, many of whom are performing sexual acts.
The question is why Google shows such different results for such similar queries.
The response marked as the "best answer" in the thread is from a top contributor (not an official Google rep), who said:
Google Images are indexed based on the way images are labelled on the publishing website and/or surrounding text on the webpage. Google can't (yet) look at an image and recognize the content visually. Any apparent bias in the image results is caused entirely by the publishers of the images, with no intervention by Google.
There are over 400 posts in that thread, along with many other complaint threads in the forums. Not one Googler has responded since the thread was posted.
Of course, this is just an algorithm, but that algorithm is returning shockingly different types of results for a very similar query.
Forum discussion at Google Web Search Help.