SEOmoz's CEO, Rand Fishkin, announced on his Google+ page that his company is working on a project that they hope will help in "classifying, identifying and removing/limiting link juice passed from sites/pages."
In short, SEOmoz is working on software to add to their toolset that would help SEOs determine if their own pages or their competitors' pages are spammy in nature. The idea is that if SEOmoz can figure it out, then Google probably has figured it out as well, and thus the links and content on pages classified as spammy are probably not worth much.
Rand explained that if they classify something as spam, it is something "we're pretty sure Google would call webspam."
One thing Rand has gotten himself in trouble with in the past is "outing" SEOs for spamming. In doing so, intentionally or unintentionally, he has tarnished his reputation amongst some SEOs.
That being said, Rand asked the public whether he should go ahead with this project. He wrote:
Some of our team members, though, do have concerns about whether SEOs will be angry that we're "exposing" spam. My feeling is that it's better to have the knowledge out there (and that anything we can catch, Google/Bing can surely better catch and discount) then to keep it hidden. I'm also hopeful this can help a lot of marketers who are trying to decide whether to acquire certain links or who have to dig themselves out of a penalty (or reverse what might have caused it).
It seems like Rand has already made up his mind, and most of the responses are in favor of SEOmoz building this out. What do you think?
Forum discussion at Google+.
Image credit to ShutterStock for spam target.