Google's John Mueller said in a Google Hangout yesterday that if you deploy an on-hover event that turns text on your web page into links, GoogleBot most likely won't see those links. Why? GoogleBot won't be using its mouse to hover over the text on your page, and thus most likely won't trigger the on-hover event.
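To illustrate the kind of pattern being discussed, here is a minimal sketch, assuming a jQuery mouseenter handler that swaps plain text for an anchor tag; the class name, data attribute and URL are made up for the example:

// Hypothetical example: the URL lives in a data attribute and the text
// only becomes a real link when a visitor hovers over it.
$('.protected-link').one('mouseenter', function () {
    var $el = $(this);
    var url = $el.data('href'); // e.g. <span class="protected-link" data-href="https://example.com/page">
    $el.html('<a href="' + url + '">' + $el.text() + '</a>');
});

Because GoogleBot renders the page once and never fires a hover event, the anchor tag never exists in the HTML it indexes.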
The question came up at the 35:30 mark in the video, where someone asked whether using jQuery to hide links from content scrapers, while still letting GoogleBot see them, would work.
The question:
We’ve tested reversed links with jQuery, would this be considered cloaking? Since when the link is hovered over by a real visitor, the real link is revealed.
John's response:
Ah, okay, so kind of like, on hover and then it turns into a link. What would happen in a case like that is that we would probably not pick up those links. Because GoogleBot isn’t going to hover over every part of the page. It will pull out the page, render it once, like a browser, it is not going to interact with the page to see what is actually going to happen when you do physical things.
If you need those links to be found by GoogleBot, then make sure we can find them when we load the page. If you just want to make them available for users, then sure, I think that might be an option. I think in most cases you wouldn’t want to do this. And if you are having problems with scrapers then I’d try to find something different to kind of attack that more directly than to try to obfuscate the links like this, which could end up causing more problems for your web site in search than the scrapers anyway.
So it seems you should not take this approach.
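If the links matter for crawling, the alternative Mueller points to is simply serving the real anchor tags in the HTML on page load. A rough contrast, with an illustrative URL:

<!-- Crawlable: the anchor tag is in the HTML GoogleBot renders on load -->
<a href="https://example.com/page">Example page</a>

<!-- Not crawlable: plain text that only becomes a link after a hover event -->
<span class="protected-link" data-href="https://example.com/page">Example page</span>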
Forum discussion at Google+.