On Friday we reported Google to Begin Treating Subdomains as Folders: Max 2 Results Per Search, based on a post from Tedster at WebmasterWorld.
Soon after, Matt Cutts of Google commented saying:
This isn't a correct characterization of what Google is looking at doing. What I was trying to say is that in some circumstances, Google may move closer to treating subdomains as we do with subdirectories. I'll talk about this more at some point after I get back from PubCon.
But Matt didn't offer any more hints in our comments area as to what he meant by this. So I swung back to the WebmasterWorld thread and saw that Matt had spoken more with Tedster on the topic, where Tedster explained:
This change will NOT mean that it's 100% impossible to rank subdomain urls in addition to urls from the main domain. The current plans are to make it harder to rank a third url, then even harder to rank a fourth, and so on with an increasing "damping factor".
Matt also did a video interview with Michael McDonald of WebProNews this afternoon, where he planned to bring more clarity to this issue. When that video goes live, we'll have even more direct information.
But just now, Matt posted subdomains and subdirectories at his personal blog explaining it all.
Matt explained that Google uses something called "host crowding," a method Google has used to show up to "two results from each hostname/subdomain of a domain name." Matt said Google has already changed the likelihood that it will show more than two results from the same hostname for the same search; this was done in the "last few weeks." For the most part, the change went unnoticed until Matt said something to Tedster, which is why Matt needed to clarify. Matt explained:
This change doesn't apply across the board; if a particular domain is really relevant, we may still return several results from that domain. For example, with a search query like [ibm] the user probably likes/wants to see several results from ibm.com. Note that this is a pretty subtle change, and it doesn't affect a majority of our queries. In fact, this change has been live for a couple weeks or so now and no one noticed.
So, all in all, this change is extremely small, not nearly as big as I originally thought.
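To picture the mechanics being described, here is a minimal sketch of a host-crowding style re-ranker, assuming the behavior outlined above: the first two results from a domain (subdomains included) keep their scores, while a third, fourth, and so on are pushed down by an increasing "damping factor." The function names, the 0.5 factor, the naive parent_domain() helper, and the sample scores are all illustrative assumptions, not Google's actual code.

```python
from collections import defaultdict
from urllib.parse import urlsplit

def parent_domain(url):
    """Naive parent-domain extraction (no public-suffix handling); illustrative only."""
    host = urlsplit(url).hostname or ""
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def host_crowd(results, free_per_group=2, damping=0.5):
    """Re-rank (url, score) pairs: the first two results from a parent domain
    keep their scores, and each additional one is damped more heavily."""
    seen = defaultdict(int)  # results counted so far per parent domain
    reranked = []
    for url, score in sorted(results, key=lambda r: r[1], reverse=True):
        group = parent_domain(url)
        extra = max(0, seen[group] + 1 - free_per_group)
        # A 3rd result gets one damping step, a 4th gets two, and so on.
        reranked.append((url, score * (damping ** extra)))
        seen[group] += 1
    return sorted(reranked, key=lambda r: r[1], reverse=True)

if __name__ == "__main__":
    results = [
        ("http://www.example.com/a", 0.95),
        ("http://blog.example.com/b", 0.93),
        ("http://shop.example.com/c", 0.90),  # 3rd example.com result: damped
        ("http://other-site.com/d", 0.60),
    ]
    for url, score in host_crowd(results):
        print(f"{score:.2f}  {url}")
```

Running this sketch drops the third example.com result below an unrelated site, which mirrors the "harder to rank a third url, then even harder to rank a fourth" behavior Tedster described.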
Has anyone seen a change in how Google ranks their sub-domains for "ego queries"?
Forum discussion at WebmasterWorld.