An Adam Lasnik post in Google Groups sparked a thread at Cre8asite Forums explaining that if you have bad HTML, Google will be OK with it.
Yes, that is the case: your code does not need to validate 100% or use perfectly proper syntax. For the most part, if the code renders in a browser, Google will be able to crawl the page to some extent.
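Google's actual parser is not public, so as a rough illustration only, here is a sketch using Python's standard html.parser showing how a lenient parser, much like a browser, can still pull out the text and links a crawler cares about from markup that would never pass a validator. The sample markup and extractor class are made up for this example.

```python
# Rough illustration only -- Google's real crawler/parser is not public.
# Python's standard html.parser is lenient, like browsers: it recovers
# text and links from markup that would never validate.
from html.parser import HTMLParser

MESSY_HTML = """
<html>
<body>
<p>Unclosed paragraph
<a href=/about>About us   <!-- unquoted attribute, never closed -->
<b>Bold text that is never closed
</body>
"""  # no </html>, mismatched tags, invalid syntax throughout


class ForgivingExtractor(HTMLParser):
    """Collect link targets and visible text, ignoring broken structure."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.text = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs, even for unquoted values
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())


parser = ForgivingExtractor()
parser.feed(MESSY_HTML)
print("Links found:", parser.links)  # e.g. ['/about']
print("Text found:", parser.text)    # e.g. ['Unclosed paragraph', 'About us', ...]
```

The point of the sketch is simply that a forgiving parser can recover useful content from broken markup; it says nothing about how Google weighs or presents that content.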
Bill Slawski wrote:
Here's a blog post I wrote a while back about how Google might handle navigation when trying to take large pages and put them on small screens: Google Identifies Navigation Bars for Small Screens, Snippets, and Indexing
Here's a snippet:
The primary focus of this patent is on identifying navigation bars on a page that can safely be re-written or changed in some manner for display on a smaller screen. An integral part of the process involves actually identifying navigation bars. It's probably important that the patent mentions (briefly) that this identification can be helpful in indexing a page and deciding upon which text to use to provide snippets to searchers, which goes beyond the reauthoring process.
Considering the ways in which search engines may want to manipulate the content of a site, and possibly even rewrite parts of it, I want as much control over the code as possible.
So yes, search engine spiders are forgiving of bad code. But how much control do you want to turn over to them in their ranking and presentation of your site to others?
So when Adam says, "I'm betting that in the vast majority of cases in which folks have indexing or ranking concerns, the core issue is NOT that their site doesn't perfectly validate," I would nod my head and move on.
Forum discussion at Cre8asite Forums and Google Groups.