Search Engine Friendly Design

Mar 18, 2008 - 6:05 pm

How can you build a web site from the ground up that pleases both crawler-based search engines and your visitors? Discover how "search engine friendly" design can tap into free traffic from search engines. This session is especially suited for beginners who need an overview of important design issues to keep in mind.

Moderator:

Mark Jackson, Search Engine Watch Expert and President and CEO, VIZION Interactive

Speakers:

Eric Papczun, Director of Natural Search, Performics
Matthew Bailey, President, SiteLogic
Craig Hordlow, Chief Search Strategist, Red Bricks Media

Contributed by Sheara Wilensky of Promediacorp.

----------

Mark Jackson:

I wanted to start off with a couple of questions: how many people here would describe their primary role as web designer? Marketing? SEO? I think we'll have a good amount of information here to digest, as well as some entertainment value.

Here is Eric Papczun:

Eric:

How many people are new to SEO? OK, good, we have a little bit for everybody here. Let's first understand the basics of search friendly design, which covers indexation: making sure the crawlers can find the great, relevant content you all have. Some of us forget that the job of the crawlers is extremely difficult, and we don't always make it easy. They have a one-size-fits-all tool that needs to understand each website, but the reality is all websites are unique, like snowflakes.

So we will also talk about content and making it clear and organized so the engines understand it, as well as some of the friendlier design techniques.

The first thing we want to do is get crawled and indexed. Think of each website as its own town. You want to make sure it's all compliant and up to code, so take a look at the Google webmaster guidelines, which are a great place to start. Also, if you are building a new site, the best thing you can do is get a link from a site that has already been crawled pointing back to yours. For the designers, I urge you to go out there and read about sitemaps, robots.txt files, and related topics.

First I want to go back to the metaphor of a town. We want to make it easy to navigate and understand where things are, with a good address system to follow, so we can find any location within the town. So the big test when you create a URL is: is it unique? There should be no other page with the same URL, and no single page reachable at two different URLs. A lot of sites have this problem, and it makes things difficult for the crawlers.

Parameters are variable, not constant. When domains start passing dynamic variables through the URLs, there's a chance the URL will change depending on how you navigate to that page. If you are tracking visitors on your site, you don't necessarily want to track them through the URL, because that can create problems. Also, no session IDs!
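To illustrate the point (a hypothetical product page; the parameter names are made up), the goal is one clean, constant address per page:

```
http://www.example.com/product.php?id=42&sid=8f3a19c2   <- session ID baked into the URL
http://www.example.com/product.php?id=42&source=home    <- tracking parameter creates a duplicate
http://www.example.com/product.php?id=42                <- one constant URL per page
```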

Flatten the folder structure. Your most important directories and pages should be as close to the root as possible. Content in the first or second layer is easier for the search engines to find and rank. When you get five, six, seven, eight directory levels down, it gets harder for the search engines to find that content.

We also recommend: don't let URLs die. If you do a redesign, do a 301 redirect instead, which is a "change of address" form for the engines.
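For readers who want the mechanics, here is a minimal sketch of that "change of address" form, assuming an Apache server with mod_alias enabled; the paths and domain are placeholders.

```apache
# Permanently (301) redirect a retired URL to its replacement
Redirect 301 /old-page.html http://www.example.com/new-page/
```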

We also want crawlable navigation. The infrastructure should make it easy for folks to get from A to B, not just one way, but by multiple paths. The better the navigation, the more success you will have with the crawlers. I love breadcrumb navigation. It makes it clear to humans and to engines where you've been.
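As a rough illustration of the kind of breadcrumb trail he's describing, here is a minimal HTML sketch; the page names and URLs are made up.

```html
<!-- A crawlable breadcrumb trail: plain text links from the home page down to the current page -->
<div class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/widgets/">Widgets</a> &gt;
  Blue Widgets
</div>
```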

We love global footers. We recommend you put links there to the main sections of your site that are most important to you and to your users.

Avoid basic mistakes like duplicate content, dead-end pages and orphan pages. Duplicate content is difficult for the search engines to handle, and your page can get dropped.

Build the most robust and comprehensive sitemap you can; it is a wonderful way for content to be found. Read more about it online, I highly recommend it. If your site's navigation fails the crawlers, at least they have a file they can go to that lists all of your URLs.
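He is most likely referring to XML sitemaps (the sitemaps.org protocol). A minimal example file, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the crawlers to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-03-18</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/widgets/blue-widgets.html</loc>
  </url>
</urlset>
```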

Robots.txt gives instructions to the search engines when they come to your site: rules saying what can and can't be crawled.
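A small illustrative robots.txt (the directory names are placeholders); it sits at the root of the domain:

```txt
# Rules for all crawlers: keep them out of non-content areas
# and point them at the XML sitemap
User-agent: *
Disallow: /cgi-bin/
Disallow: /checkout/
Sitemap: http://www.example.com/sitemap.xml
```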

One of the greatest things is that Google has provided us with a reporting dashboard, Google Webmaster Tools, to show how the crawlers are seeing your site. If you haven't set this up, I would strongly recommend doing so.

Content:

Let's assume now we've done everything right. Now we want to make sure we are optimizing good, relevant content. Put your pages to the test: do I have clarity about what this page represents? Look at the title and meta data. If it doesn't have a focus to you, then it doesn't have a focus for your reader.

Alignment: once you distill what that landing page represents, make sure you put that description in your title tag to show how your page is unique, then make sure the design is neat and consistent.

When you write your title and meta tags, don't think only about ranking; think about the search engine results page, where you want your listing to stand out. Ask yourself how you are different from all the other listings that might show up for a keyword.
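As a concrete (entirely hypothetical) example of a focused title and description for a single landing page:

```html
<head>
  <!-- A focused, unique title: the page's topic first, the brand after -->
  <title>Blue Widgets - Acme Widget Co.</title>
  <!-- The description often becomes your snippet on the results page, so write it for searchers -->
  <meta name="description" content="Hand-built blue widgets with free shipping and a two-year warranty.">
</head>
```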

Tips:

I love Flash, JavaScript and Ajax, but we have to use these things judiciously. We love lakes and rivers, they are beautiful, but we don't want to flood our city! Surround them with HTML.

Next up is Matt Bailey of SiteLogic:

Search friendly design is one of the most critical parts of building a site and of SEO. You can have the best SEO staff, but if it ain't crawlable, it won't matter! The first thing we look at is the architecture and the programming. You gotta fix the foundation before you fix the house.

It’s important to allow access to your site to EVERYBODY.

I want to talk about Target.com, which is being sued because the site doesn't work for blind users! 1) Lack of alt attributes. That's all! That's the number one thing people are asking for, and they won't do it! 2) Image maps without alt text. 3) You need a mouse to navigate their site.

So this site is not only blocked to those users, but to search engines too, because search engines see the site the same way the blind do!

Google wants your website in their database. The more info they have, the better info they can present to users, who are their customers. So their guidelines tell you how to make that happen! If you have an artist who is being difficult, tell their boss to make them read the Google guidelines! Who is right?

Search for the web accessibility checklist from the W3C and see how your website measures up. It's almost point for point the same as the Google guidelines. Why? Because search engines can't see, hear, or click: they are the most disabled thing that will come to your website!

My definition of accessibility is that your site should be available to anyone, anywhere, anytime.

Target – the alt attribute is necessary because if you have info inside an image, the search engine won’t know what it is! Look at what’s contained in the image – the sales info! If you can’t see it you won’t know where to click to get the sale price! So Target has gone to extreme lengths to not let people use their website!
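The fix he's asking for is small. A sketch, with made-up file names and sale copy:

```html
<!-- Without alt text, neither a screen reader nor a crawler knows what this promotion says -->
<img src="/images/sale-banner.gif" alt="Spring sale: all patio furniture 25% off">

<!-- The same goes for image-map areas -->
<map name="storenav">
  <area shape="rect" coords="0,0,120,40" href="/clearance/" alt="Clearance items">
</map>
```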

There are other sites that make a user select a country or a language on the home page in order to proceed on the site – from a drop down box – and that stops the search engines in their tracks!
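To illustrate the difference (the country paths are placeholders): a script-driven drop-down leaves a spider with nothing to follow, while plain links give both users and crawlers a path forward.

```html
<!-- A select box that navigates via JavaScript: no followable links here -->
<select onchange="window.location = this.value;">
  <option value="/us/">United States</option>
  <option value="/uk/">United Kingdom</option>
</select>

<!-- Plain text links to the same destinations are crawlable -->
<p><a href="/us/">United States</a> | <a href="/uk/">United Kingdom</a></p>
```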

Cluttered URLs: before you start an SEO campaign, look at the URL! My rule of thumb is, if the URL is longer than the address bar, you need to do something! Not only make it shorter, but use a favicon! It's a great way to brand your website! It's just a little graphic icon that appears next to the URL, which makes it easy for branding and bookmarking. Do it.
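For reference, a favicon is a one-line job: drop a favicon.ico file in the site root and reference it from the head of each page.

```html
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">
```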

Education on CSS –

I love it. Because it puts the emphasis on the content rather than the markup - all that geek stuff goes in an extra file. So now the search engines don’t have to parse through the code, they can just get the content.

Linearization: search engines read from left to right, but tables go top to bottom, so it's a mess for the search engines. If you use a table-based layout you will also get a chunky website on a mobile device. CSS is great because it's cross-platform.
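A simplified before-and-after sketch of what he means; the content is identical, only the markup changes.

```html
<!-- Table-based layout: a crawler (or a phone) reads this cell by cell,
     so the sidebar markup is tangled up with the real content -->
<table>
  <tr>
    <td>Sidebar links...</td>
    <td>Main article content...</td>
  </tr>
</table>

<!-- CSS-based layout: the content sits in clean, ordered blocks and the
     presentation (column widths, floats) lives in an external stylesheet -->
<div id="content">Main article content...</div>
<div id="sidebar">Sidebar links...</div>
```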

Check out CNET: they have gone completely CSS. There is no smushed-together table structure, the search engines are reading the content exactly as it appears on the page, and it looks beautiful on a mobile device.

Validation helps you find mistakes: tags that are left open, and so on. Fix them and rankings can improve, because the search engines will read the site properly.

Architecture (breaks down navigation of gobreck.com)

How do I get where I want to go? Make sure the path of information is consistent for the search engine and the searcher. This site is a mess.

Rapid City, South Dakota: they have a good navigation structure, with breadcrumb navigation that lets us know where we are. We know where we are because the current page is highlighted and there is an arrow. We can rely on the navigation structure, which is crawlable and keyword rich, so it's great for users and search engines. Don't ever change your navigation midway through the site; keep it consistent throughout.

---------

Next up is Craig Hordlow.

An intervention happened to me because I had been sending a lot of negative emails to the creative services team at my company telling them Flash has no value at all. So I stand before you now as a recovering Flash hater. I was cured by learning about the workarounds! It's all about the exceptions.

Why did I end up in an intervention? There are two audiences for your website: search engines and humans. But we also have two producers: the designers and the SEOs. You typically hear an SEO say don't use Flash, avoid it, while designers say use it, it's cool.

Rich Internet Applications (RIAs) are here to stay. You need to be prepared to deal with them. Last year Steve Jobs and Bill Gates were interviewed at a conference, and Bill Gates said about RIAs, "We'll look back at this as one of the great periods of invention." So SEOs can't keep telling people to avoid Flash and emerging RIAs!

I spoke at an Adobe conference about RIAs and SEO, and reviewed techniques to accommodate rich internet applications. Here is one I came up with that I call my CSS Silver Bullet. Ironically, the first time I implemented it was on Adobe's website. If you look at the Creative Suite 3 homepage you will see tabs, navigation elements that look like they would lead to a whole new page if you clicked them, but the URL doesn't change. A first reaction by an SEO is: that's bad, we need a unique URL on every page! Well, yes, but no. You can have a page that is largely in Flash, but the Flash needs to be embedded. The workaround is a tab on the page with a degree of visibility that does not compete with the Flash, and the SEO content unfolds when it is clicked. All you need is one little tab to maintain the SEO integrity of a Flash page. We use JavaScript to toggle that search engine friendly content, and the content is typically hidden with CSS: unless the user hovers or clicks, the user does not see it. So how is this CSS tab thing useful? All you ask of the designer is to create one SEO tab!
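This is not Adobe's actual markup, just a minimal sketch of the kind of toggle he describes: the page leads with Flash, and a visible tab reveals an HTML version of the same content (the file names and copy are placeholders).

```html
<!-- The Flash movie carries the visual experience -->
<object data="/swf/product-tour.swf" type="application/x-shockwave-flash"
        width="800" height="500"></object>

<!-- A visible tab toggles the crawlable HTML version of the same content -->
<a href="#seo-content" onclick="toggleSeoContent(); return false;">Product overview (HTML)</a>

<div id="seo-content" style="display: none;">
  <h1>Product Tour</h1>
  <p>The same copy, headings and links that appear inside the Flash movie...</p>
</div>

<script type="text/javascript">
  // Show or hide the HTML block when the tab is clicked
  function toggleSeoContent() {
    var block = document.getElementById('seo-content');
    block.style.display = (block.style.display === 'none') ? 'block' : 'none';
  }
</script>
```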

So what is the objective? To let the RIA developer or designer have their freedom.

However, there are some considerations:

- Do not abuse this technique.
- Never make the trigger invisible.
- Have integrity about the content you place in your tabbed area.

Developers are way ahead of the search engines' ability to index this content. So it begs the question: when the hell are the search engines going to catch up?

I think we need the text-based sites for another 5 years or more. Google will rely increasingly on its Google Webmaster Tools to handle issues in general. The engines will eventually (way in the future) rely on a hybrid model of personalization, click metrics, RIA indexing, and text-based indexing.

Coverage was provided by Sheara Wilensky of Promedia Corp.

 
