In May, we reported that Google Search algorithms lowered the rankings of deepfake sites. Now, Google has gone into more detail on how this algorithm works and how you can manually remove deepfakes from Google Search.
Deepfake Search Ranking Updates
Emma Higham, Product Manager at Google, said the search company made "updates to our ranking systems to keep this type of content from appearing high up in Search results." Plus, Google also made it easier to remove deepfakes from Google Search.
Emma Higham wrote, "we are updating our ranking systems for queries where there’s a higher risk of explicit fake content appearing in Search." Google looked to have updated these systems back in May of this year, but it seems more improvements have been made since.
One deepfake site, Mrdeepfakes.com, showed an 11% drop in search visibility in Google from April to May 2024. But now when I look, it looks more like a 40% drop in search visibility.
Here is the Semrush chart:
Google made a couple of changes to its algorithms around this:
(1) Promote news over deepfakes: Google said it pushed out "ranking updates that will lower explicit fake content for many searches." The example Google gave was "for queries that are specifically seeking this content and include people’s names, we'll aim to surface high-quality, non-explicit content — like relevant news articles — when it’s available." Google said these "updates we’ve made this year have reduced exposure to explicit image results on these types of queries by over 70%."
(2) Deepfakes versus real consensual nudity: Google said, "There's also a need to distinguish explicit content that’s real and consensual (like an actor’s nude scenes) from explicit fake content (like deepfakes featuring said actor)."
Google said it is "a technical challenge for search engines" to differentiate between these two types of content but Google added it is "making ongoing improvements to better surface legitimate content and downrank explicit fake content."
(3) Site-level demotions: Google explained how it does this: by looking at patterns. "Generally, if a site has a lot of pages that we've removed from Search under our policies, that's a pretty strong signal that it's not a high-quality site, and we should factor that into how we rank other pages from that site," Google wrote. So Google is demoting sites that have received a high volume of removals for fake explicit imagery. Google added that "this approach has worked well for other types of harmful content, and our testing shows that it will be a valuable way to reduce fake explicit content in search results."
And from the Semrush chart above, in that one case, it seems to be working.
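To make the concept concrete, here is a minimal sketch of what a site-level demotion signal based on removal volume could look like. To be clear, Google has not published how its system is implemented; the threshold, field names, and scoring below are purely hypothetical and only illustrate the general idea of "lots of policy removals from a site means demote its other pages."

```python
# Hypothetical illustration of a site-level demotion signal based on
# the volume of policy removals -- not Google's actual implementation.

from dataclasses import dataclass


@dataclass
class SiteRemovalStats:
    indexed_pages: int    # pages from the site known to the index
    policy_removals: int  # pages removed under removal policies


def demotion_factor(stats: SiteRemovalStats,
                    removal_ratio_threshold: float = 0.05,
                    max_demotion: float = 0.5) -> float:
    """Return a multiplier (0-1] applied to the site's other pages.

    If a large share of a site's pages have been removed under policy,
    scale down the ranking score of its remaining pages.
    """
    if stats.indexed_pages == 0:
        return 1.0
    ratio = stats.policy_removals / stats.indexed_pages
    if ratio < removal_ratio_threshold:
        return 1.0  # too few removals to treat as a site-wide signal
    # Demote more aggressively as the removal ratio grows, capped at max_demotion.
    return max(max_demotion, 1.0 - ratio)


# Example: a site with 10,000 indexed pages and 2,000 policy removals
# would have its other pages' scores multiplied by 0.8.
print(demotion_factor(SiteRemovalStats(indexed_pages=10_000, policy_removals=2_000)))
```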
Google also updated the removals section of its ranking systems help documentation.
Before:
Personal information removals: If we process a high volume of personal information removals involving a site with exploitative removal practices, we demote other content from the site in our results. We also look to see if the same pattern of behavior is happening with other sites and, if so, apply demotions to content on those sites. We may apply similar demotion practices for sites that receive a high volume of doxxing content removals or non-consensual explicit imagery removals.
After:
Personal information removals: If we process a high volume of personal information removals involving a site with exploitative removal practices, we demote other content from the site in our results. We also look to see if the same pattern of behavior is happening with other sites and, if so, apply demotions to content on those sites. We may apply similar demotion practices for sites that receive a high volume of removals of content involving doxxing content, explicit personal imagery created or shared without consent, or explicit non-consensual fake content.
And Google updated the removals section of its spam policies documentation:
Before:
Personal information removals: If we process a high volume of personal information removals involving a site with exploitative removal practices, we demote other content from the site in our results. We also look to see if the same pattern of behavior is happening with other sites and, if so, apply demotions to content on those sites. We may apply similar demotion practices for sites that receive a high volume of doxxing content removals or non-consensual explicit imagery removals.
After:
Personal information removals: If we process a high volume of personal information removals involving a site with exploitative removal practices, we demote other content from the site in our results. We also look to see if the same pattern of behavior is happening with other sites and, if so, apply demotions to content on those sites. We may apply similar demotion practices for sites that receive a high volume of removals of content involving doxxing content, explicit personal imagery created or shared without consent, or explicit non-consensual fake content.
Manually Remove Deepfakes From Google Search
Google also made it easier to remove deepfakes from Google Search. "When someone successfully requests the removal of explicit non-consensual fake content featuring them from Search, Google’s systems will also aim to filter all explicit results on similar searches about them. In addition, when someone successfully removes an image from Search under our policies, our systems will scan for – and remove – any duplicates of that image that we find," Google wrote.
So Google is not only making it easier to request removals, it will also filter those results not just from the site the image was reported on but also from other sites that host duplicates of that image.
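Google has not said how it finds those duplicates. One common technique for this kind of near-duplicate matching is perceptual hashing, so here is a minimal sketch of that approach using the third-party Pillow and ImageHash libraries; the file names and distance threshold are hypothetical, and this is only an illustration of the general technique, not Google's method.

```python
# A minimal sketch of near-duplicate image detection using perceptual
# hashing -- a common technique for this kind of matching. Google has
# not described how its systems find duplicates; the file names and the
# distance threshold below are hypothetical.

from PIL import Image  # pip install Pillow
import imagehash       # pip install ImageHash


def is_near_duplicate(path_a: str, path_b: str, max_distance: int = 5) -> bool:
    """Compare two images by perceptual hash; a small Hamming distance
    means the images are likely duplicates (resized, re-encoded, etc.)."""
    hash_a = imagehash.phash(Image.open(path_a))
    hash_b = imagehash.phash(Image.open(path_b))
    return hash_a - hash_b <= max_distance  # '-' gives the Hamming distance


# Example: check whether a newly crawled image matches a removed one.
if is_near_duplicate("removed_image.jpg", "newly_crawled_image.jpg"):
    print("Likely duplicate of a removed image; exclude it from results.")
```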
Forum discussion at X.