Web pages or product listings stored in a database or a dynamic page assembly system can be "invisible" to crawler-based search engines. Discover solutions to this problem and the other unique issues that
need to be considered by those running dynamic web sites. In addition, discover
why dynamic sites needn't be a problem but can actually be a benefit when
dealing with search engines.
This session is moderated by
Detlev Johnson of Position Technologies
with
Laura Thieme of Bizresearch,
Mikkel deMib Svendsen of deMib.com and
Jake Baillie of STN Labs
Mikkel gets things rolling by discussing some of the problems and solutions for
dynamic sites. The core problem is access - engines may have problems indexing
pages. He talks about the IRTA model - Index, Ranking, Traffic and Actions.
Simplify the technology to make it easier for users and spiders. He talks about
a virtual bridge between users and spiders. An example of this is dealing with
the complexity of dynamic URLs by using mod_rewrite.
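As a rough sketch of what that bridge can look like (the product names, IDs and
URL patterns below are hypothetical), a dynamic site can publish clean,
spider-friendly URLs in its links while a rewrite layer maps them back to the
underlying script:

```python
import re

def slugify(text):
    """Lower-case a name and collapse anything non-alphanumeric to hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def clean_url(product_id, name):
    """Spider-friendly URL to publish in links; behind the scenes a rewrite
    rule maps it back to something like /product.php?id=123."""
    return "/products/%d/%s/" % (product_id, slugify(name))

print(clean_url(123, "Blue Widget, Large"))  # /products/123/blue-widget-large/
```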
Things that are not a problem -
- That you store your information in a database
- Question marks - these merely indicate to engines that the web site uses a
template, which in itself is not a problem
- SSI (server side includes)
- File extension
Mikkel points out that there is an infinite number of indexing problems.
- Long URLs
- Duplicate content - session ids, time stamped URLs, etc.
- Technologies - AJAX, etc.
- Spider traps
- Server downtime or slow responses
Indirect issues include -
- Support of cookies, JavaScript and Flash.
- Geo targeting and personalization
- Form (post method) navigation
There are multiple solutions to any problem. One example is mod_rewrite, where
you take a URL with multiple parameters and rewrite it so that it uses just
one. He also says you do not need a dynamic site, especially if you have a half
dozen pages or so. Another option is to have dynamic elements in static sites.
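Here is a minimal sketch of that kind of rewrite in Python (the parameter names
and patterns are invented for illustration); in practice the same mapping is
usually written as an Apache mod_rewrite rule:

```python
import re

# Clean path that users and spiders see, e.g. /catalog/garden-tools/4711/
CLEAN = re.compile(r"^/catalog/(?P<cat>[\w-]+)/(?P<item>\d+)/$")

def rewrite(path):
    """Translate the clean path back into the multi-parameter dynamic URL
    the application actually runs - the job a RewriteRule normally does."""
    m = CLEAN.match(path)
    if not m:
        return path  # anything else passes through unchanged
    return "/catalog.php?category=%s&item=%s" % (m.group("cat"), m.group("item"))

print(rewrite("/catalog/garden-tools/4711/"))
# -> /catalog.php?category=garden-tools&item=4711
```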
Laura is up next. When dealing with a dynamic site, the first thing to look at
is the URL structure and how well pages are indexed in search engines. Look at
current rankings as well. Ultimately, the goal is to overcome technology,
resource or political issues.
She shows us an example of an easy fix - home page titles - using Pier 1
Imports as an example. She suggests targeting some of the most popular terms.
She also points out that the site had extremely long URLs but only Google was
having issues indexing them. Another very easy fix is optimizing category
titles. She suggests placing several keywords or variations in the title tag in
order of importance. All the things that can be optimized on static sites can
and should be optimized on dynamic sites as well.
She shows several examples of large e-commerce sites that neglected to optimize
title tags. She warns to watch out for CMS and e-commerce solutions that are
not search engine friendly. The speed at which changes are picked up by MSN may
indicate future success in Google. Also consider optimizing a data feed for
Yahoo! Use 301 redirects when changing pages.
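For the 301 point, here is a bare-bones sketch of a permanent redirect (the old
and new paths are hypothetical; most sites would configure this in the web
server or framework rather than by hand):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Old paths mapped to their new homes (hypothetical examples).
MOVED = {"/old-category/widgets.html": "/products/widgets/"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in MOVED:
            # 301 tells engines the move is permanent, so the old listing
            # and its links are consolidated onto the new URL.
            self.send_response(301)
            self.send_header("Location", MOVED[self.path])
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>Current page</body></html>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```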
Finally Jake steps up to the podium. The first thing Jake points out is that
dynamic sites are not a problem - they haven't been for five years now. SEO is
more of a design philosophy as opposed to changing little things here and
there. He points out that there is no requirement that a URL actually point to
a file, so there are a lot of fun things you can do with dynamic sites.
The first technique Jake talks about is using dynamic 800 numbers to track
conversions. He uses the example of JustFlowers.com, which uses the same 800
numbers across multiple ads. Dynamic sites can assign these numbers on the fly,
and by doing this, one can track how ads are actually converting.
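A minimal sketch of how a dynamic page might pick the number to display (the
sources and phone numbers below are made up):

```python
from typing import Optional

# Hypothetical mapping of ad sources to tracking numbers.
TRACKING_NUMBERS = {
    "google_ppc": "1-800-555-0101",
    "yahoo_ppc": "1-800-555-0102",
    "email": "1-800-555-0103",
}
DEFAULT_NUMBER = "1-800-555-0100"

def phone_for_visitor(source: Optional[str]) -> str:
    """Pick the 800 number to render on the page based on the ad source
    (e.g. a tagged landing-page parameter), so phone orders can be tied
    back to the ad that produced them."""
    return TRACKING_NUMBERS.get(source or "", DEFAULT_NUMBER)

print(phone_for_visitor("google_ppc"))  # 1-800-555-0101
print(phone_for_visitor(None))          # 1-800-555-0100
```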
He then talks about serving different content to different users. USA Today for
example will deliver a nicely formatted web page for mobile phones even though
you may be using the same URL on the phone as you would in a web browser on a
PC.
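A rough sketch of that kind of user-agent detection (the substrings and markup
here are placeholders, not what USA Today actually does):

```python
# Hypothetical substrings that suggest a phone browser.
MOBILE_HINTS = ("blackberry", "windows ce", "palm", "symbian", "mobile")

def render_page(user_agent):
    """Return a slim phone-formatted page or the full desktop page for the
    same URL, based on the User-Agent header."""
    ua = user_agent.lower()
    if any(hint in ua for hint in MOBILE_HINTS):
        return "<html><body>Slim, phone-formatted front page</body></html>"
    return "<html><body>Full desktop front page</body></html>"

print(render_page("BlackBerry8700/4.1.0 Profile/MIDP-2.0"))
```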
Another fun thing you can do with dynamic sites is to use cookies and
sub-domains.
Mine failed search results from your logs. People misspell all the time, and
sending them to "no results found" pages does not create a good user
experience. Learn what people are searching for where they get no results, and
adjust your web strategy.
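A small sketch of that log mining, assuming a CSV-style site-search log with
"query" and "result_count" columns (adjust the parsing to whatever your logs
actually record):

```python
import csv
from collections import Counter

def failed_queries(log_path, top=20):
    """Count site-search queries that returned zero results so the most
    common misses (often misspellings) float to the top."""
    misses = Counter()
    with open(log_path, newline="") as fh:
        for row in csv.DictReader(fh):
            if int(row["result_count"]) == 0:
                misses[row["query"].strip().lower()] += 1
    return misses.most_common(top)

if __name__ == "__main__":
    for query, count in failed_queries("site_search_log.csv"):
        print("%5d  %s" % (count, query))
```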
Use mod_rewrite to switch out images for those who steal (hotlink) them from
you.
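The Referer check behind that trick might look roughly like this (example.com
and the substitute image are placeholders; in practice this is usually a
mod_rewrite condition on the Referer header):

```python
from urllib.parse import urlparse

OWN_HOSTS = {"example.com", "www.example.com"}  # hypothetical site

def image_to_serve(requested_path, referer):
    """Serve the real image to your own pages (or direct requests), and a
    substitute to pages that hotlink it from other sites."""
    host = urlparse(referer).netloc.lower() if referer else ""
    if not host or host in OWN_HOSTS:
        return requested_path
    return "/images/hotlink-notice.gif"

print(image_to_serve("/images/product.jpg", "http://some-other-site.com/page"))
# -> /images/hotlink-notice.gif
```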
I decided to hang around and try to recap some of the Q&A.
Q: Do users and spiders see SSI differently?
A: If engines could see that you were including files, you would have a
security problem. In other words, they only see the final product, not the raw
HTML. The spider is just like a browser.
Q: How do you check to see if your pages are in the supplemental index?
A: Spider your own web site and export the results into a spreadsheet so you
can see the titles. If you see repeated titles, you know you might have a
supplemental index problem.
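A quick sketch of that check (the URL list is a placeholder; it would normally
come from your own crawl or a sitemap export):

```python
import csv
import re
from urllib.request import urlopen

TITLE_RE = re.compile(r"<title>(.*?)</title>", re.IGNORECASE | re.DOTALL)

def collect_titles(urls, out_path="titles.csv"):
    """Fetch each URL, pull out its <title>, and write URL/title pairs to a
    CSV so repeated titles stand out at a glance."""
    first_seen = {}
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["url", "title", "duplicate_of"])
        for url in urls:
            html = urlopen(url).read().decode("utf-8", errors="replace")
            match = TITLE_RE.search(html)
            title = match.group(1).strip() if match else ""
            writer.writerow([url, title, first_seen.get(title, "")])
            first_seen.setdefault(title, url)

if __name__ == "__main__":
    collect_titles(["http://www.example.com/", "http://www.example.com/about/"])
```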
Q: If a client is shopping for a search engine friendly e-commerce platform,
what should they look for?
A: The ability to be indexed. Check other sites using that software and see if
there are indexing problems - duplicate content, pages not being indexed, etc.
Secondly, look at how much flexibility you have in changing the site.
It was difficult to write down much more from this session, as quite a few of
the things Mikkel and Jake talked about you just had to "be there" to really
get.
David Wallace - CEO and Founder, SearchRank