Live: CSS, AJAX, Web 2.0 & SEO

Oct 7, 2009 - 1:50 pm
Filed Under SMX East 2009

Below is live coverage of the CSS, AJAX, Web 2.0 & SEO session from the SMX East 2009 conference.

This coverage is provided by Sheara Wilensky of Promediacorp & Brian Ussery - Beu Blog.

We are using a live blogging tool to provide the real time coverage, so please excuse any typos. You can also interact with us while we are live blogging, so feel free to ask us questions as we blog. We will publish the archive below after the session is completed.

CSS, AJAX, Web 2.0 & SEO (10/07/2009)
2:00 Sheara Wilensky:  

CSS, AJAX, Web 2.0 & SEO – This session looks at CSS, AJAX and Web 2.0 dynamic design techniques that can cause search engine indexing and ranking issues, with solutions to consider. Programmed by Jane & Robot.

Moderator: Vanessa Fox, Contributing Editor, Search Engine Land

Speakers:

Benj Arriola, SEO Engineer, BusinessOnLine
Richard Chavez, SEO Manager, iCrossing
Bruce Johnson, TBA, Google
Kathrin Probst, TBA, Google
2:04 Sheara Wilensky:  Vanessa Fox: This might be the most technical session that we have.
2:05 Sheara Wilensky:  How many of you use Ajax on your site and are looking for good SEO answers?
2:05 Sheara Wilensky:  How many are looking to replace it?
2:06 Sheara Wilensky:  Richard Chavez from iCrossing is up first.
2:06 Sheara Wilensky:  Richard: What is AJAX? What are the SEO challenges with AJAX?
2:06 Sheara Wilensky:  Let's get started.
2:06 Sheara Wilensky:  [talks about his company, iCrossing]
2:07 Sheara Wilensky:  So, what is AJAX? Asynchronous JavaScript and XML. Used on the client side to create interactive web apps or rich internet apps.
2:07 Sheara Wilensky:  Some of the challenges of AJAX:
2:07 Sheara Wilensky:  1) Uses extensive amounts of JavaScript, so bots have difficulty getting through this.
2:08 Sheara Wilensky:  Google does a pretty good job, but Yahoo and Bing have difficulty.
2:09 Sheara Wilensky:  2) Lack of client-side or ON PAGE content. Content is stored server side, so it is not necessarily on the page itself. Content is rendered via the AJAX engine.
2:09
Expand
2:09 Sheara Wilensky:  3) Stop-crawl parameters: Bots do not typically parse data past the "#" fragment; data past the "#" is ignored.
2:10 Sheara Wilensky:  4) All content is rendered under one URL, so mixed content themes dilute keyword relevancy.
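To make those challenges concrete, here is a minimal sketch (the markup, endpoint and fragment names are hypothetical) of the pattern being described: the visible copy only arrives via an XMLHttpRequest keyed off the "#" fragment, so a crawler that fetches the raw HTML sees little more than a loading message.

    <div id="content">Loading...</div>
    <script>
      // Content is assembled client-side from the fragment, e.g. example.com/page#products
      var state = window.location.hash.replace('#', '');
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/ajax/content?section=' + state, true);
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          // The indexable copy only exists in the DOM after this callback runs
          document.getElementById('content').innerHTML = xhr.responseText;
        }
      };
      xhr.send();
    </script>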
2:11 Sheara Wilensky:  Tactical Suggestions:
  • Create identical alternative content: leverage apps such as SWFObject to render an SEO-friendly version of the URL.
2:12 Sheara Wilensky:  JavaScript navigation: Alter JS navigation to remove any commands within the URL quotes. Reference the JS control externally and call the file via a class or ID attribute.
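A minimal sketch of that navigation suggestion (the class name and URLs are made up): keep a real, crawlable URL in the href and attach the JavaScript behavior externally via a class, instead of embedding a javascript: command inside the URL quotes.

    <!-- Before: nothing for a bot to follow -->
    <a href="javascript:loadSection('shoes')">Shoes</a>

    <!-- After: a plain URL in the href; the JS control is referenced externally -->
    <a href="/shoes/" class="ajax-nav">Shoes</a>
    <script>
      // Hook the behavior onto the class; bots follow the href, users get AJAX.
      var links = document.getElementsByClassName('ajax-nav');
      for (var i = 0; i < links.length; i++) {
        links[i].onclick = function () {
          loadSection(this.getAttribute('href'));  // hypothetical AJAX loader
          return false;                            // suppress the full page load
        };
      }
    </script>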
2:14 Brian Ussery:  Richard suggests XML sitemaps, footer navigation, supporting (X)HTML and submitting sitemaps via Google Webmaster Tools.
2:14 Sheara Wilensky:  Additional suggestions:
  • Create crawlable paths such as a sitemap page, footer navigation, and a tiered sitemap structure.
  • Supporting HTML content.
  • XML Sitemap file (example below).
Whatever method you choose, make sure you have unique content displayed.
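For reference, a minimal XML Sitemap file looks like the sketch below (the example.com URLs are placeholders); it hands the engines a crawlable list of the alternative URLs even when the on-page navigation is script-driven.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
      </url>
      <url>
        <loc>http://www.example.com/locations/new-york/</loc>
      </url>
    </urlset>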
2:16 Sheara Wilensky:  SEO Tactics Deployed: (example with Wachovia branch location on a Google map)
  • Individual location URLs isolated from JS controls.
  • Tiered sitemap structure linked from homepage.
2:16 [Comment From Barry Schwartz]
Google's official announcement at http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
2:16 [Comment From Michael Martinez]
Google Webmaster Tools Team just proposed a standard to make AJAX crawlable. Cf. http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
2:17 Sheara Wilensky:  Discusses use of SWFObject and the hash tag.
2:17 Brian Ussery:  For technical folks, "hash tag" = fragment #anchor
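As a rough illustration of the SWFObject tactic mentioned above (the file names, element ID, dimensions and Flash version are assumptions): the SEO-friendly HTML lives in the page, and SWFObject only swaps in the rich version when the visitor's plugin supports it, so crawlers index the plain fallback.

    <div id="rich-content">
      <!-- SEO-friendly fallback content; this is what crawlers see -->
      <h1>Store Locations</h1>
      <p>Find a branch near you, or <a href="/locations/">browse all locations</a>.</p>
    </div>
    <script src="/js/swfobject.js"></script>
    <script>
      // Replaces #rich-content with the Flash app only for capable browsers.
      swfobject.embedSWF("/swf/locator.swf", "rich-content", "600", "400", "9.0.0");
    </script>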
2:18 Sheara Wilensky:  Giving a case study.
2:19 Sheara Wilensky:  Some key takeaways:
2:19 Sheara Wilensky:  Make sure your on-page content is visible, use a tiered site structure, make sure all your technical problems are solved up front, and ensure URLs are crawlable.
2:20 Sheara Wilensky:  I highly recommend getting SEO integrated during the build; it's easier than implementing it after.
2:20 Sheara Wilensky:  Thank you!
2:21 Sheara Wilensky:  Vanessa: Some of the resources can be found on janeandrobot.com/resources
2:21 Brian Ussery:  Vanessa points out resources at http://www.janeandrobot.com/resources.
2:22 Brian Ussery:  
Up next:
Bruce Johnson, Google Web Toolkit Lead, Google Atlanta, GA

2:22 Sheara Wilensky:   Next up is Bruce Johnson, Engineering Manager at Google.
2:23 [Comment From Montana]
I have a question about hiding content with CSS or JS and its effect on SEO.
2:23 Sheara Wilensky:  Bruce: We work on developer tools at Google, giving developers the ability to create very sophisticated AJAX applications.
2:24 Sheara Wilensky:  Very JS heavy.
2:24 Sheara Wilensky:  Again, this is somewhat redundant, but web crawlers don't always see what users see.
2:24 Sheara Wilensky:  JS produces dynamic content that is not seen by crawlers.
2:25 Sheara Wilensky:  Example: A Google Web Toolkit app that looks like this to a user [image], only looks like this to a web crawler.
2:25 Sheara Wilensky:  Why does this problem need to be solved?
2:26 Sheara Wilensky:  
  • Web 2.0: More content on the web is being created dynamically (69%)
  • Over time, this hurts search
  • Developers are discouraged from building dynamic apps
  • Not solving AJAX crawlability holds back progress on the web
2:26 Sheara Wilensky:  Here is a diagram - a crawler's view of the web, with and without AJAX.
2:29 Sheara Wilensky:  So crawling and indexing AJAX is needed for users and developers.
2:29 Sheara Wilensky:  How do you know which AJAX states should be indexed? You want some way to opt in and say this is a special state you want crawled. Obviously cloaking will always be an issue. The larger the app, the harder it is to maintain.
2:30 Brian Ussery:  According to the W3C, "hash tags" (fragment identifiers, #anchors) point to the same resource, so the engines are working on a way to recognize and crawl these changing states.
2:30 Sheara Wilensky:  Now Kathrin of Google steps up:
2:31 Brian Ussery:  One solution could be to let the crawler run the script, but this is expensive, indexes would be old and stale, and only the major engines could execute this level of crawling.
2:31 Sheara Wilensky:  Kathrin: So why don't the crawlers execute the JS on the web?
  • Very expensive to do, and time consuming
  • If all the crawlers were to execute JavaScript all the time (and only the major search engines could attempt to do this), then a few months down the road the changes to your website won't show up, because the engines are busy with the JS and not updating their index.
2:32 Brian Ussery:  Another solution might be to allow web servers to execute the JavaScript.
2:32 Sheara Wilensky:  Web servers execute their own JS at crawl time to avoid the above problems, which gives more control to webmasters. It can also be done more automatically and does not require ongoing maintenance.
2:33 Sheara Wilensky:  A diagram: pretty URLs vs. ugly URLs.
2:35 Sheara Wilensky:  The crawler will do its thing and find URLs; some will be pretty (with a hash fragment and an exclamation mark), so the crawler knows to index them. It will map the pretty URL to the ugly URL.
2:35 Sheara Wilensky:  In the second step, it will go and request that URL from your web server, so it passes the hash fragment on to the web server.
2:35 Brian Ussery:  Adding "#!" instead of "#" alone will tell engines content is crawlable and should be indexed...
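To illustrate the mapping (the URLs are hypothetical, and the escaped form shown is the one from Google's published version of the AJAX crawling scheme, which may differ in detail from what was on the slides):

    Pretty URL (what users see and what gets indexed):
      http://www.example.com/stocks#!quote=GOOG

    Ugly URL (what the crawler actually requests from the web server):
      http://www.example.com/stocks?_escaped_fragment_=quote=GOOG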
2:35 Sheara Wilensky:  In step three, the web server will reverse the mapping and recreate the pretty URL.
2:36 Sheara Wilensky:  In step 4, it will then invoke a headless browser, so you will get back an HTML snapshot.
2:37 Brian Ussery:  This is the headless browser concept a.k.a. "Ichabod"...
2:37 Sheara Wilensky:  The HTML snapshot will then go back to the crawler, which will index it, extract links from it, and keep going.
2:37 Sheara Wilensky:  Crawl time: any time the search engines update their indices.
2:37 Sheara Wilensky:  At search time, which is almost all the time, they will get a pretty URL, nothing changes.
2:38 Sheara Wilensky:  Web servers and webmasters will agree to opt into the scheme by indicating indexable states.
2:38 Sheara Wilensky:  The webservers will also agree to execute JS when they hit ugly URLs.
2:39 Sheara Wilensky:  The next point is that web servers will now agree not to cloak. For any of the search engines, if you give different content to the crawler than you do to the user, you will risk elimination from the index.
2:39 Brian Ussery:  Google proposes a special token indicating content to be indexed by engines. The token is an exclamation point in the #anchor (i.e., "#!").
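A small client-side sketch of that opt-in (the function and state names are hypothetical): when the application changes state it writes "#!" rather than a bare "#" into the URL, which is the token that marks the state as indexable under the proposal.

    <script>
      function showProduct(id) {
        // "#!" instead of "#" opts this state into crawling/indexing,
        // e.g. http://www.example.com/catalog#!product=123
        window.location.hash = '!product=' + id;
        loadProductViaAjax(id);  // hypothetical AJAX loader for the state
      }
    </script>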
2:40 Sheara Wilensky:  To summarize, we are going to go through the life-of-a-URL process.
2:40 Sheara Wilensky:  We are proposing that developers tweak their URLs into pretty URLs.
2:41 Sheara Wilensky:  So here is a summary. We are currently working on a proposal and prototype implementation.
2:41 Sheara Wilensky:  Please visit the Google webmaster central blog and help forum to discuss and leave feedback.
2:41 Sheara Wilensky:  Thank you!
2:43 Brian Ussery:  
Google's presentation is available via:
http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
2:44 Brian Ussery:  Everyone seems to be in favor of making AJAX crawlable....
2:44 Sheara Wilensky:  Last up is Benj Arriola.
2:45 Sheara Wilensky:  Don't leave this session yet...
2:46 Sheara Wilensky:  Benj is speaking on CSS and Code positioning in SEO.
2:46 Sheara Wilensky:  I am going to talk about how to position your code in CSS.
2:46 Sheara Wilensky:  [very quick because almost out of time]
2:48 Sheara Wilensky:  An experiment done in 2006 looked at link text and two links going to the same page. SEOmoz also did a blog post on the order of two links going to the same place. Basically, what they are saying is that since the first link's anchor text is "blog" and the second link's is "celebrity news blog", only the first one counts.
2:48 Sheara Wilensky:  An experiment: We had 3 links using unknown words going to the same place.
2:49 Sheara Wilensky:  Note: this experiment was done in a controlled environment.
2:49 Sheara Wilensky:  When you searched for the link text, results showed only the page where the link was located, not the page it was pointing to.
2:49 Sheara Wilensky:  The second link's text was likewise only surfacing the page where the link was located.
2:50 Sheara Wilensky:  But the first link's text was surfacing not only the page where the link was located, but also the page the link was pointing to.
2:50 Sheara Wilensky:  So in our opinion this validated the experiments done by SEOmoz and the other guy.
2:51 Sheara Wilensky:  So what's the significance?
2:51 Sheara Wilensky:  A lot of people use footer links to improve SEO, but they don't consider the order of the links.
2:52 Sheara Wilensky:  A good tool to check the order of duplicate links is firstlinkchecker.com.
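A tiny illustration of the duplicate-link point (URLs and anchor text are made up): when two links on the same page point to the same URL, these experiments suggest only the first anchor text passes its keyword signal, so a keyword-rich footer link repeating an earlier link likely adds nothing.

    <!-- First link: this anchor text ("blog") is the one that counts -->
    <a href="/blog/">blog</a>

    <!-- Later on the page (e.g. in the footer): effectively ignored as anchor text -->
    <a href="/blog/">celebrity news blog</a>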
2:53 Sheara Wilensky:  So we have a left sidebar and a right sidebar; the main content goes first, before the sidebar, which is good because it's more keyword optimized. So float left in CSS.
2:53 Sheara Wilensky:  You want the main content to come before the sidebar in the code because it's more keyword focused.
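A minimal sketch of that source-order technique (the IDs and widths are assumptions): the keyword-focused content block comes first in the HTML, and floats place the sidebar beside it visually.

    <style>
      /* Main content first in the source, floated left; sidebar fills the right.
         For a left-hand sidebar, swap the float directions instead of the source order. */
      #wrapper { overflow: hidden; }  /* contain the floats */
      #main    { float: left;  width: 70%; }
      #sidebar { float: right; width: 28%; }
    </style>

    <div id="wrapper">
      <div id="main"><!-- keyword-focused main content --></div>
      <div id="sidebar"><!-- secondary links and content --></div>
    </div>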
2:54 Sheara Wilensky:  What if you have 3 columns? The center column comes first; the sidebars float to the left and right.
2:54 Sheara Wilensky:  What about your top bar nav?
2:54 Sheara Wilensky:  BTW - if you do a float: right, remember to do a text-align: left.
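A rough three-column sketch along those lines (the widths and the negative-margin trick are assumptions about implementation, not something shown on the slides): the center column stays first in the source, the sidebars are floated to either side, and the right column carries the text-align: left reminder.

    <style>
      #center { float: left;  width: 50%; margin-left: 25%; }   /* first in the source */
      #left   { float: left;  width: 25%; margin-left: -75%; }  /* pulled back to the left edge */
      #right  { float: right; width: 25%; text-align: left; }   /* the speaker's float-right reminder */
    </style>

    <div id="center"><!-- main, keyword-focused content --></div>
    <div id="left"><!-- left sidebar --></div>
    <div id="right"><!-- right sidebar --></div>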
2:54 [Comment From Michael Martinez]
Matt Cutts addressed the dual link issue in 2008:
2:54 [Comment From Michael Martinez]
http://www.linkspiel.com/2008/07/mattcutts-bat-phone/
2:56 Sheara Wilensky:  In summary: it's important for the main content to come first in the code; only the first anchor text is considered by search engines; control the direction of floating columns; and control the top bar nav with absolute positioning.
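And a rough sketch of that last point (the selectors and the 40px height are hypothetical): the navigation can sit after the main content in the source yet still render across the top of the page via absolute positioning.

    <style>
      body    { position: relative; padding-top: 40px; }  /* reserve room for the nav */
      #topnav { position: absolute; top: 0; left: 0; width: 100%; height: 40px; }
    </style>

    <div id="content"><!-- keyword-focused content: first in the source --></div>
    <div id="topnav"><!-- navigation links: last in the source, displayed at the top --></div>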
2:56 Sheara Wilensky:  That's it, we are out of time!