Fundamentals Track
Successful Site Architecture
Learn how to successfully architect your site for search engines and how specific page elements and design technologies may impact your ability to gain good organic listings. Covers topics such as directory and file structure, server-side includes (SSIs), 404 error trapping, JavaScript, robots.txt use, frames, secure area usage, and much more. Toward the end of the session, as time allows, volunteers from the audience will have their sites examined to see how changes to their site architecture and design could increase search engine traffic. It's highly recommended that those new to search engine marketing attend the "Search Engine Friendly Design" session on Day 1 beforehand.
Moderator: Barbara C. Coll, CEO, WebMama.com Inc.
Speakers: Matthew Bailey, President, Site Logic Marketing; Barbara C. Coll, CEO, WebMama.com Inc.; Derrick Wheeler, Senior Search Strategist, Acxiom Digital
MB = Matt Bailey; BC = Barbara Coll; DW = Derrick Wheeler
BC launches into the session. Most web design doesn't have SEO in mind. If the goal is to produce sales, the aim is to achieve high rankings for the keywords that convert. Your site has to get into the search engines first, however. Think about SEO throughout the design cycle. Look for optimization opportunities. Look for web developers and designers who have SEO experience, or partner with people who do. There is more education coming. Check references on whoever you hire. Architecture includes directory structure, file systems, domains, error handling, redirects, and URLs. Technology includes CMS, tracking, and development applications.
How do you get the team on board to build an optimized site? Educate the team. Convince the team; show them stats on landing pages and conversions, for example. Bribe the team, ha ha. Search engines want your content. If your content isn't in there, that's your fault. How many people use Google Sitemaps? Make sure engines can navigate your site. Use their tools. Check which pages are being indexed. Are they following links to my site? Use www.se.spider.com. Use www.rexswain.com as an HTTP viewer. Tell SEs which pages are valid on your site. SEs never forget your site. Visit Google's webmaster pages for info on how to submit. You must own the site or have access to it. Google Sitemaps tracks the history of a site and will show you content and external-link reporting. Yahoo Site Explorer is not as informative but has its pros: you can look at a single page's external links.
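To illustrate the kind of check an HTTP viewer like rexswain.com performs, here is a minimal Python sketch (not a tool mentioned by the speakers; the URL and user-agent string are placeholders) that fetches a page roughly the way a spider would and shows the status code, headers, and raw HTML the server actually returns.

```python
# Minimal HTTP "viewer": fetch a page roughly the way a crawler would and
# show the status code, response headers, and the start of the raw HTML.
# The URL and user-agent below are placeholders -- substitute your own page.
import urllib.request

req = urllib.request.Request(
    "http://www.example.com/",
    headers={"User-Agent": "site-architecture-check/1.0"},
)
with urllib.request.urlopen(req) as resp:
    print("Status:", resp.status)
    for name, value in resp.getheaders():
        print(f"{name}: {value}")
    print(resp.read(500).decode("utf-8", errors="replace"))
```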
DW:
Step 1: Master the basics. Offers a diagram of how an SE works and "sees" your web page. It must find your homepage. Then it must follow the links to all the other pages. It must grab the content and put it into an index. Shows a funny drawing of the process: 1. SE crawls the site; 2. SE indexes the entire site; 3. users perform targeted queries; 4. SE ranks appropriate pages; 5. users click on ranked listings; 6. users take action and/or interact with the site.
Where are you now? What are your domains and subdomains? Identify them. Track your 301 and 302 pages. Measure the number of pages from your site that are indexed at each engine. Identify sections or pages of your website that are not indexed. Track your rankings; this tells you which keywords you need to focus on. Study your traffic. How you link from page to page is vital. It's easy for SEs to follow text links: there is a URL and a page with a keyword file name. That's easy. Navigation in JavaScript is not; there is no path to a URL then, just mumbo-jumbo code like "onmouseover". Prepare a sitemap so SEs can find links. Pop-up links are difficult to follow too. If an image has no alt attribute, the engine can't tell what the image is about. Use CSS for text links and dress them up to look like images instead; SEs can handle CSS. SEs can't fill out forms, so you have to offer an alternative way of navigating, like a sitemap. He showed an example of a travel form and a page on the travel site that listed the same options, only you could find it without the form, using text links.
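As a rough illustration of DW's point (not an example from the session), the sketch below pulls links out of markup the way a crawler essentially does: a plain a-href anchor yields a URL it can follow, while navigation hidden in a JavaScript handler yields nothing. The markup is hypothetical.

```python
# Roughly how an engine discovers links: plain <a href> anchors yield URLs it
# can follow; navigation buried in JavaScript handlers (onmouseover, onclick)
# yields nothing crawlable. The markup below is hypothetical.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = """
<a href="/hotels/paris-hotels.html">Paris hotels</a>
<span onmouseover="navigate('/hotels/rome')">Rome hotels</span>
"""
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/hotels/paris-hotels.html'] -- the JS "link" is invisible
```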
Footer links are nice in two rows of text links. You can put nofollow on privacy, copyright, and terms in the bottom row; in the top row, link to pages you really want engines to find. Keep your URLs short. Limit the number of parameters, limit the number of directory levels, and limit total length, in general; as the value or occurrence of each goes up, the chances of success go down. Don't stuff long URLs with keywords. Use HTTP status codes, or server response codes, to help SEs: 200 for okay, 301 for a permanent move, 302 for a temporary move, 404 for invalid or mistyped URLs. SEs will treat a 302 like a 301; if in doubt, use a 301. SEs will remove the old URL and add the new one.
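A small Python sketch of the kind of check this suggests: asking a server which status code (200, 301, 302, 404) it actually returns for a URL, without silently following the redirect, so you can confirm a moved page really sends a 301. The URLs are placeholders, not tools or sites from the session, and the sketch assumes plain http.

```python
# Check the status code a URL actually returns (200, 301, 302, 404) without
# following redirects -- useful for confirming a moved page sends a 301
# rather than a 302. URLs are placeholders; assumes http (not https).
import http.client
from urllib.parse import urlsplit

def status_of(url):
    parts = urlsplit(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location")  # target of a 301/302, if any
    conn.close()
    return resp.status, location

print(status_of("http://www.example.com/old-page.html"))
# e.g. (301, 'http://www.example.com/new-page.html')
```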
The circle of death. Use the robots.txt file to prohibit SEs; use the robots meta tag for noindex/nofollow. Don't accidentally block your site. SEs do not accept cookies, so requiring them prevents indexing. Watch homepage redirects. Use breadcrumbs that never change; if they are dynamically generated, SEs don't have a direct path. Related-products links present complicated paths for users and engines. He showed some good examples. (Hard to describe.) Use robots.txt to prevent spider traps from very long URLs. Tracking IDs look like duplicate content to SEs. Use absolute links for https, not relative ones. Use short, easy-to-remember names for your domains. No dashes. Page content must have clean code. See www.marketleap.com/ses for slides. (I suggest going there for the section on spider traps. It was complicated, and the visuals tell the story better than I could by typing what he said.)
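One way to avoid the "accidentally blocked your site" problem is to test robots.txt programmatically. Here's a minimal sketch using Python's standard urllib.robotparser; the site and paths are placeholders, not examples from the session.

```python
# Sanity-check that robots.txt blocks only what you intend (e.g. spider-trap
# URLs with endless parameters) and has not accidentally blocked real content.
# The site and paths are placeholders.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

paths = [
    "/products/widget.html",                          # should be crawlable
    "/search?sessionid=123&sort=price&page=9999",     # intended spider trap
]
for path in paths:
    allowed = rp.can_fetch("*", "http://www.example.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")
```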
MB:
Takes a different view on site architecture. He focuses on blindness issues and disabled users, who, as it happens, experience a site much as search engines do. If you do accessibility well, you will likely do well with the engines. He talks about the Target lawsuit. Since the web is an extension of the physical store, blind users want the web version to be accessible: alt attributes, image maps with alternatives, etc., and Target refuses to provide them. The only way you can use the Target form is to push a button with your mouse; if you can't see it, you can't use it. Sometimes you require someone to select a language or country before getting into your site. SEs can't pick languages. Flash pages offer no information for SEs. A retailer refuses to include alt attributes?
The site is image-based; if you remove the images, nothing is there. All the sale info, calls to action, free shipping, etc. were in images. If you can't see the image, how can you know what's for sale? Engines won't see this either. Do not stuff keywords into alt attributes. Alt attributes are meant to describe the image when it can't be seen. Why would you not want to do this for users and engines?
Matt says that people don't read the Google and W3C guidelines. If you pass the accessibility checklist, you will pass the SE checklist; they are very similar. Cluttered URLs: these are the dynamically generated ones that are a mile long. You can't find actual products on the Target site because the URLs are too busy. Favicons are a great way to brand. It's an icon file. It's free advertising. Rewrite cluttered URLs and redirect from the old links to the new ones. Look for pages that still go to old links and 301 or 302 them back. Maximize with SE-friendly URLs. Identify the top entry pages to the site by URL; those are the pages that are ranking well. Redirects: MSN has instructions on how to handle your redirects for their search engine.
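To make the rewrite-and-redirect idea concrete, here is a minimal, hypothetical sketch: each old, cluttered URL maps to its rewritten, SE-friendly equivalent and answers with a permanent 301. In practice this usually lives in the web server or CMS configuration rather than in a standalone script; the mappings and port below are made up, not from the session.

```python
# Sketch of "redirect old links to new ones": map each old, cluttered URL to
# its rewritten, SE-friendly equivalent and answer with a permanent 301 so
# engines drop the old URL and index the new one. Mappings and port are
# hypothetical; real sites usually do this in server or CMS config.
from http.server import BaseHTTPRequestHandler, HTTPServer

REDIRECTS = {
    "/catalog.asp?cat=12&prod=3456&ref=9": "/products/red-widget.html",
    "/index.php?page=about": "/about.html",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        new_path = REDIRECTS.get(self.path)
        if new_path:
            self.send_response(301)            # permanent move
            self.send_header("Location", new_path)
            self.end_headers()
        else:
            self.send_response(404)            # invalid or mistyped URL
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()
```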
For CMS (content management systems), look for language support, flexibility, query-string handling (URL rewriting available), unique page titles, meta descriptions, and server redirects. CSS and standards: can validated code help you rank higher? No. Using CSS is a benefit, but it won't be the main difference. CSS strips away all the code and lets content be the primary focus of the page; it moves the gunk to an external file. This is good for engines. CSS vs. tables: engines try to stack the table. Look at a site in a mobile device to see what stacking looks like. SEs go top to bottom, then the next row, then top to bottom again. Screen readers do the same thing. Validation helps assure that spiders can index content.
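A tiny sketch of what "unique page titles and meta descriptions" from a CMS means in practice: each page record supplies its own title and description instead of every page sharing one boilerplate head. The data and function name are hypothetical, not from the session.

```python
# Each page record supplies its own <title> and meta description, rather than
# every page sharing one boilerplate <head>. Data and names are hypothetical.
def render_head(page):
    return (f"<title>{page['title']}</title>\n"
            f'<meta name="description" content="{page["description"]}">')

pages = [
    {"title": "Red Widgets - Acme Store",
     "description": "Shop red widgets with free shipping."},
    {"title": "Blue Widgets - Acme Store",
     "description": "Compare our full line of blue widgets."},
]
for p in pages:
    print(render_head(p), "\n")
```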