Diagnosing Web Site Architecture Issues

Jun 4, 2008 - 6:37 pm

Session Intro: Provides a checklist and workflow for diagnosing your web sites for SEO obstacles using freely available diagnostic tools.

Vanessa Fox, Features Editor at Search Engine Land, is moderating this session. Speakers include David Golightly, User Experience Developer Lead at Zillow; Jonathan Hochman, President of Hochman Consultants; and Chris Silver Smith, Lead Strategist at GravityStream / Netconcepts.

Vanessa starts off with the question of "what really matters?" Accessibility, discoverability and conversions are the big things that matter. The place to start when investigating potential problems is the search results themselves. Is the problem related to indexing, ranking or crawling? Identify the problem first.

Chris is then introduced as the first speaker. Diagnosing problems involves a wide range of criteria, but most issues are basic and easy to diagnose - things like misusing robots.txt or the robots meta tag and inadvertently blocking spiders. The first question to ask is "are the pages actually indexed?" If not, there is a problem! Do the site's URLs carry session IDs, and if so, is anything being done to resolve that? Google Webmaster Central is useful for getting a bird's-eye view of what Googlebot sees when visiting your site. The title tag and meta tag data it shows can be quite revealing and interesting.
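
As a quick illustration of how easy such a block is to create, here is a hypothetical robots.txt (the paths are invented for this example) showing a stray site-wide disallow next to what was probably intended:

    # Hypothetical robots.txt, for illustration only.
    # Variant 1 - a leftover staging rule that blocks every crawler from the whole site:
    User-agent: *
    Disallow: /

    # Variant 2 - the narrower intent: keep spiders out of session-ID URLs only.
    # (Wildcard patterns like this are honored by Googlebot but not by every crawler.)
    User-agent: *
    Disallow: /*?sessionid=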

Chris uses www.web-sniffer.net to check server header status codes and make sure pages return the proper codes. He uses www.seebot.org to view web pages the way a search engine would; the Firefox Web Developer toolbar can do this as well. The Firefox Link Counter extension provides useful link data. Chris points to SEOmoz's SEO Toolbox as a good source of tools for diagnosing problems; one such tool shows other sites hosted on your IP address. Use Google Sets to identify your competitors, and you can also see which sites Google thinks are related to yours.
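
For anyone who would rather script that header check than use a web form, a few lines do the same job. This is only a sketch - it assumes Node 18+ for the built-in fetch, and the URL is a placeholder:

    // headercheck.ts - print the HTTP status code and a few interesting headers for a URL.
    // Sketch only; assumes Node 18+ (global fetch). Run: npx ts-node headercheck.ts <url>
    async function checkHeaders(url: string): Promise<void> {
      // redirect: "manual" so we see the 301/302 itself rather than the final destination
      const res = await fetch(url, { method: "HEAD", redirect: "manual" });
      console.log(`${url} -> ${res.status} ${res.statusText}`);
      for (const name of ["location", "content-type", "x-robots-tag"]) {
        const value = res.headers.get(name);
        if (value) console.log(`  ${name}: ${value}`);
      }
    }

    checkHeaders(process.argv[2] ?? "https://example.com/").catch(console.error);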

Jonathan is up next to dive into some diagnostics. Using the NoScript add-on, Jonathan can turn scripts off and uncover problems; he shows issues on the SMX site and on Gillette's site when scripts are turned off. He recommends the Googlebar (not the Google Toolbar), a Firefox plug-in with a one-click button that shows you Google's cache. Another Firefox add-on he mentions is Live HTTP Headers, which shows header status codes.

With regard to rich media applications, you need to be able to feed the bots content they can understand. Serve indexable HTML content and replace it with the rich media version by manipulating the Document Object Model (DOM). For Silverlight, create SEO-friendly insertion code or, better yet, bug Microsoft to provide a solution.
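
A minimal sketch of that DOM-swap approach (the element id and the .swf path are made up for this example): the page ships indexable HTML, and script replaces it with the rich-media object only when script actually runs.

    // The markup contains a crawlable HTML fallback inside <div id="product-tour">.
    // When script runs we swap in the rich-media object; bots and no-script visitors
    // keep the original HTML. The id and media path are hypothetical.
    document.addEventListener("DOMContentLoaded", () => {
      const host = document.getElementById("product-tour");
      if (!host) return;

      const media = document.createElement("object");
      media.setAttribute("type", "application/x-shockwave-flash");
      media.setAttribute("data", "/media/tour.swf");
      media.setAttribute("width", "600");
      media.setAttribute("height", "400");

      host.replaceChildren(media); // the HTML version stays in place if this never runs
    });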

Xenu's Link Sleuth will crawl href links just as a search engine bot would, which makes it easy to find and fix broken links. It will also help you create a site map. The Firefox Web Developer add-on has multiple functions that are valuable.
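
For the curious, a crude scripted version of the same idea: fetch one page, pull out the absolute hrefs, and flag anything that does not return a 2xx. Real crawlers such as Link Sleuth parse HTML properly and follow links site-wide; the regex and start URL here are just for illustration (Node 18+ assumed).

    // linksleuth-sketch.ts - flag broken links found on a single page.
    async function checkPageLinks(pageUrl: string): Promise<void> {
      const html = await (await fetch(pageUrl)).text();
      // Crude extraction of absolute links; a real tool would use an HTML parser.
      const hrefs = [...html.matchAll(/href="(https?:\/\/[^"]+)"/g)].map((m) => m[1]);

      for (const url of new Set(hrefs)) {
        try {
          const res = await fetch(url, { method: "HEAD", redirect: "follow" });
          if (!res.ok) console.log(`BROKEN (${res.status}): ${url}`);
        } catch {
          console.log(`UNREACHABLE: ${url}`);
        }
      }
    }

    checkPageLinks("https://example.com/").catch(console.error);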

Watch out for problems with frames, iframes, Flash and Silverlight: each object is treated as a separate item and not as part of the host page. Ajax can be problematic as well, and it frequently relies on iframes.

Finally, David is up to show some problems they found at Zillow. One problem with their old database was that it was not highly configurable for multiple data sets; they also wanted it to be responsive to a wide range of user options. Another problem was that many of Zillow's functions depended on JavaScript - with JS turned off, users (and search engines) were not able to use the site. As of 2/08, only 200,000 out of 80,000,000 homes were actually indexed, and the pages did not rank well.

They also improved navigation to help bots find pages. Breadcrumbs help bots reach very interior pages. Ajax on the top, not the bottom - in other words, AJAX should be built on top of a functioning web page and not the other way around. SEO should work in concert with great UX.
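
One way to picture "AJAX on top": the breadcrumbs are ordinary <a href> links that navigate normally with script disabled, and JavaScript merely intercepts them for an in-page update. A rough sketch with invented ids:

    // #breadcrumbs and #content are hypothetical ids for this sketch.
    // Without script the plain hrefs still navigate (and remain crawlable);
    // with script, clicks are upgraded to an in-page AJAX load.
    document.addEventListener("DOMContentLoaded", () => {
      const nav = document.getElementById("breadcrumbs");
      if (!nav) return;

      nav.addEventListener("click", async (event) => {
        const link = (event.target as HTMLElement).closest("a");
        if (!link) return;

        event.preventDefault(); // only happens when script is available
        const res = await fetch(link.href); // a real site would fetch just a content fragment
        document.getElementById("content")!.innerHTML = await res.text();
        history.pushState(null, "", link.href); // keep a real, crawlable, shareable URL
      });
    });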

Q&A:

Here is a recap of "some" (not all) of the questions that were asked and answered.

  1. Are there any automated tools that check whether redirects are working correctly after they have been set up?

    Jonathan recommends Xenu's Link Sleuth and Vanessa recommends using Google Webmaster Central. (A scripted version of this check is sketched just after this list.)

  2. What are nofollows for?

    Chris answers: to control the flow of PageRank (PageRank sculpting), to stop link juice flowing to undesirable pages, and to control blog comment spam.

  3. Does private WHOIS put you at a disadvantage?

    Chris says probably not. Vanessa says that as long as you are not spamming, it should not be a problem.

  4. To optimize or not to optimize pop-ups?

    Do you want it as a separate page? It may not make a good entry page, and if the pop-up is generated by JavaScript, it is not going to be indexable anyway.
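
Regarding the first question above, an automated redirect check can also be scripted. Here is a sketch: for each old/new URL pair (the pairs below are placeholders), confirm the old URL answers with a 301 and the expected Location header. It assumes Node 18+, whose fetch exposes the 3xx response with redirect: "manual"; relative Location values would need extra handling.

    // redirectcheck.ts - verify that old URLs 301 to the expected targets.
    const redirects: Array<[string, string]> = [
      ["https://example.com/old-page", "https://example.com/new-page"], // placeholder pair
    ];

    async function verify([from, to]: [string, string]): Promise<void> {
      const res = await fetch(from, { method: "HEAD", redirect: "manual" });
      const location = res.headers.get("location") ?? "(none)";
      const ok = res.status === 301 && location === to;
      console.log(`${ok ? "OK  " : "FAIL"} ${from} -> ${res.status} ${location}`);
    }

    Promise.all(redirects.map(verify)).catch(console.error);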



Session coverage by David Wallace - CEO and Founder of SearchRank.

 
