At Pubcon, Matt Cutts was asked about reciprocal linking. We already explored the reciprocal linking question with Adam Lasnik who said that reciprocal linking with relevant sites is better than reciprocal linking with every possible site on the Internet.
A WebmasterWorld thread quotes my coverage of the Matt Cutts keynote (shout out to cnvi for reading!) where I pretty much cover the question about reciprocal linking:
Q: People are all about links, but then there's a concern about linking to bad neighborhoods. How do you identify bad neighborhoods? Should you nofollow them or stay away totally?
Matt: Use your gut. Trading links is natural, and it's natural to have reciprocal links. At some level, natural reciprocal links happen, but if you do it way too often, it looks artificial. My advice is to go with your gut, and if you're worried, you can use nofollow.
Yup, that's what I said. Then I talked about birds. (Actually, Matt did. See coverage here.) In any event, martinibuster expounds upon Matt's statement to say what we pretty much already know. Matt isn't saying anything different with regard to reciprocal links. He's just saying that you need to be careful with your reciprocal linking approach:
1. Heavy reciprocal linking won't pass a hand check.
2. Light reciprocal linking may pass a hand check, whether it occurs naturally between similar sites or, apparently, even when it is the unnatural kind, as long as it stays light.
3. The limits of reciprocal linking are purposely left ambiguous. That means either there is no clear number, by policy or algorithm, for how many recips you can have; it is hard to put a number on it because it occurs naturally; and/or the limits are left to the judgment of individual hand checkers, i.e., Google reserves editorial discretion when performing a hand check.
That makes complete sense. Be careful in your strategy if you really are inclined to do reciprocal linking.
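If you do take Matt's nofollow suggestion for links you're unsure about, the markup is just a rel attribute on the anchor. A quick sketch (the URL here is a placeholder, not a real partner site):

```html
<!-- A reciprocal link you're not fully confident in: rel="nofollow"
     asks search engines not to pass link equity through it -->
<a href="http://example.com/partner-page" rel="nofollow">Partner Site</a>
```

The link still works for visitors; you're only hedging what it passes along to the engines.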
Forum discussion continues at WebmasterWorld.