SEO to the limit

May 9, 2004 - 12:46 pm by Alan Webb
Filed Under Search Theory

Most professional search engine optimizers know how to obtain high rankings in Google using tricks that break Google's inclusion guidelines. Testing methods is important, and "throwaway" domains are often used for testing purposes. If professional search engine optimizers do not know how far they can go before incurring a penalty, they will not be able to optimize a website to its full potential. Terms like "black hat SEO" are used on various forums and in articles to describe search engine spamming methodology. It is not, however, a clear black-and-white (hat) issue, and I personally don't like the terms "black hat" and "white hat", as the definitions are too often nowhere near clear-cut, or even agreed upon within the SEO community.

There are many legitimate uses for much of the SEO methodology listed as "no-nos" in the Google guidelines (http://www.google.com/webmasters/guidelines.html). This is probably why Google calls them guidelines and not conditions for inclusion.

There are many so-called gray methods of optimization which many webmasters, having read the Google guidelines, automatically discount for fear of penalization. Of course, most bending and outright breaking of the Google guidelines is carried out with the sole purpose of manipulating Google, often in a crass manner such as publishing literally thousands of doorway pages that automatically redirect, or hiding text through one of many different methods.

There are some forms of cloaking (providing the Google spider with different, highly optimized content than a human with a browser would see) that Google would, and does, encourage, were it not so open to abuse. Other methods involve manipulation of HTML tags with CSS, JavaScript redirection, URL rewriting, creating near-identical pages, using CSS layers and so on. I have used most of the above on MY OWN website and do not fear penalization. Why am I not worried that I might be penalized by Google? Because my content is exactly the same for a human visitor as it is for a search engine spider, and no attempt to dupe Google through hidden content or redirection takes place. There may also be a legitimate design reason for using a given technique. I actually HELP Google by providing new pages for indexation, and I make my own (and clients') sites much more spiderable and indexable.

One search engine optimizer's 'advanced techniques' are another's spam. There is unlikely ever to be a full consensus on what is or is not an acceptable search engine optimization technique. What you need to do is simply ask yourself: am I trying to dupe Google here? If a competitor reported my page for spamming, could I sleep at night in the knowledge that the site would not be penalized? The important part is intent. Are you helping the search engines or manipulating them?

Below are some examples which would make some webmasters cringe, yet which are, at least in my opinion and experience, perfectly acceptable.

1. Removing session IDs from online shops, forums (in my own case) and other dynamic websites. This could be considered a form of cloaking, as it often involves a user-agent or IP check: if the visitor is Googlebot, drop the session ID; if it is a human with a browser, generate one. Session IDs are a sure way to make your site invisible on the search engine results pages. Word still hasn't got round to some major corporation webmasters about the real damage session IDs can do to a ranking. "We need to know the full click paths of our visitors" is a common statement. "You aren't going to get many visitors if you don't do something about the session IDs" is my usual reply. A little programming could solve the majority of the problems online shops, for example, have in getting deep-crawled and indexed. Google has officially said it regards the removal of session IDs as valid search engine optimization.
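To make the idea concrete, here is a minimal PHP sketch (the helper name and the list of spider user-agents are my own, purely for illustration): ordinary visitors get a normal session, while known spiders are served exactly the same page without a session ID ever being generated.

<?php
// is_search_spider() is a hypothetical helper: it simply checks the
// user-agent string against a short list of known crawler names.
function is_search_spider() {
    $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    foreach (array('Googlebot', 'Slurp', 'msnbot') as $bot) {
        if (stripos($agent, $bot) !== false) {
            return true;
        }
    }
    return false;
}

if (!is_search_spider()) {
    // Humans get a normal session (and, if cookies are off, a session ID in the URL).
    session_start();
} else {
    // Spiders get exactly the same content, just with no session ID
    // appended to URLs, so every crawled URL stays stable.
    ini_set('session.use_trans_sid', '0');
}
?>

The content served is identical either way; the only difference is that the spider never sees a session ID bolted onto the URL.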

2. URL rewriting through PHP/ASP programming can also be used to remove session IDs or to flatten URLs (removing multiple parameters, "?", "&" etc., from a URL). With PHP you can rewrite the URL to hide a session ID. You could say this provides Google with a URL which is not the real one. Google will be happy, though, if you have a quality site which it is now possible for their spider to crawl and index!
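As a rough illustration of URL flattening in PHP (the script name, URL scheme and parameters below are invented, not from any real site): internal links are written in a path style such as /catalog.php/books/42 instead of /catalog.php?cat=books&id=42, and the receiving script reads the values back out of PATH_INFO, so human and spider both see the same parameter-free URL.

<?php
// Build a flattened link: /catalog.php/books/42 instead of
// /catalog.php?cat=books&id=42 (example URL scheme only).
function flat_url($cat, $id) {
    return '/catalog.php/' . rawurlencode($cat) . '/' . (int) $id;
}

// On the receiving end, recover the parameters from PATH_INFO
// (this assumes the web server passes PATH_INFO to the script).
$cat = 'default';
$id  = 0;
if (!empty($_SERVER['PATH_INFO'])) {
    $parts = explode('/', trim($_SERVER['PATH_INFO'], '/'));
    if (count($parts) >= 2) {
        $cat = $parts[0];
        $id  = (int) $parts[1];
    }
}

echo '<a href="' . flat_url('books', 42) . '">Books, item 42</a>';
?>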

3. Re-definition of heading tags (H1 etc.) through the use of CSS. The H1-H3 tags are commonly either too big or simply do not fit in with your design or look and feel. It is perfectly acceptable to redefine the size of these tags from their defaults through CSS. It is known that heading tags (H1-H3) can help strengthen a page's relevance in the eyes of a search engine and therefore help its ranking. Not as much as a few months ago, arguably, but still a good thing to have. An H1 tag in its default state is an eyesore. With CSS you can make it fit your site's look and feel: resize it, colour it, underline it and so on. What you most definitely do not want to do is make it invisible, use it where you would not normally use a heading, or make it tiny. Example code for a redefined H1 tag may look something like:

H1 { FONT-WEIGHT: bold; FONT-SIZE: medium; COLOR: #990000; FONT-FAMILY: Verdana, Arial, Helvetica, sans-serif }

4. There are also legitimate uses for JavaScript redirection. One example is calling framesets for framed pages that end up as landing pages after a click on a search engine result. Without their framesets they often lack navigation and, of course, are not seen as they should be in the context of the other frames. A simple JavaScript redirect can solve the problem of framed pages becoming landing pages without a frameset. An example of a very basic redirection script which solves this problem is:

<script type="text/javascript">
<!--
// If this page has been loaded on its own, reload it inside its frameset.
if (top.frames.length <= 0)
    top.location = "http://www.your-frameset-page.com";
//-->
</script>

The above code, placed on all framed pages, solves the problem of pages being loaded without their framesets. It is the quick and dirty way to do it; the best solution, of course, is to drop framesets altogether, but that is not for this article. There are several ways to skin this particular cat, though. Netmechanic.com has some good information on more advanced ways to do the above.

5. Sometimes for technical reasons, or more commonly because the webmaster lacks the programming know-how, dynamic pages cannot be spidered by search engine robots, so there needs to be a way to get these dynamic content pages spidered and indexed. Multiple doorway pages that automatically redirect are the way many would go about solving this problem. Many a website has flown out of Google because of just this practice! Do not be tempted.

So what can you do that provides a legitimate way to create static, indexable content that is very similar to your dynamic (non-indexable) pages? How about creating a "printer version" of your pages? Some dynamic forums such as vBulletin do this automatically by archiving threads. There is also software that can create static HTML pages from dynamic ones, but it can be costly. One simple way is to save your most important pages using save-as in your browser (not the whole website, just the page), then make each page printer-friendly by, say, removing colourful backgrounds and changing to printer-friendly fonts and graphics. There should be no issue of content duplication because, for whatever reason, the dynamic pages could not be indexed anyway. This method is clearly open to abuse: you should never redirect automatically, and it should be used only if your dynamic pages are completely unable to be spidered or indexed. It goes without saying that the content should be the same as that of the dynamic page.
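If you would rather automate the snapshot than use save-as, a very crude PHP sketch of the idea might look like this (the URL, the output file name and the background-stripping step are all invented for illustration, and it assumes allow_url_fopen is enabled and the target directory exists):

<?php
// Fetch the rendered output of a dynamic page (example URL).
$html = file_get_contents('http://www.example.com/catalog.php?cat=books&id=42');

if ($html !== false) {
    // Crude printer-friendly touch: strip background colours/images
    // from inline styles (illustration only, not production-grade).
    $html = preg_replace('/background(-image|-color)?\s*:[^;"]*/i', '', $html);

    // Save as a static, spiderable page.
    file_put_contents('printer/books-42.html', $html);
}
?>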

6. The use of the Apache mod_rewrite module (or, on Windows servers, ISAPI filters) to cut down the number of parameters in dynamic URLs (Google has problems with URLs of more than three parameters, and has trouble deep-crawling long URLs full of "?" and "&") is not cloaking and is perfectly acceptable. I mention this because many think that flattening out URLs through server-side technology must be some form of spam. I use it on my own forum and it works very well. The user sees the same URL as a search engine spider, and therefore it is not a form of cloaking. All I am doing is helping Google index my pages.
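For those wondering what such a rule actually looks like, here is a bare-bones Apache .htaccess sketch (the URL pattern and script name are invented for illustration): a flat address like /forum/topic-123.html is silently mapped back onto the real dynamic script, so visitors and spiders only ever see the flat, parameter-free URL.

# .htaccess - map a flat, parameter-free URL onto the real dynamic script.
# /forum/topic-123.html  ->  /forum/viewtopic.php?id=123  (internal rewrite only)
RewriteEngine On
RewriteRule ^forum/topic-([0-9]+)\.html$ /forum/viewtopic.php?id=$1 [L]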

7. Text links in the footer area (at the bottom of the visible page) as alternative navigation to a traditional top-left or top horizontal navigation not only help spiders find internal pages, but are also an excellent workaround for main navigation which uses, say, imagemaps, Java or JavaScript menus, whose links would otherwise for the most part not be followable. It is also good design practice to have navigation below as well as above the "fold". Don't feel you are spamming just because you add additional text links in the footer of your page.

The above represents seven forms of legitimate optimization which work and do not involve spamming the search engines. They should be used in moderation, and whatever you do, do not hide content or attempt to dupe the search engines through redirection where no valid reason exists. Some of the above methods can also be used to spam search engines. Believe me, it is not worth the risk. If you overdo any of the above and your site gets removed, do not point the finger at this article or at me. I use the above to help the search engines, not to abuse them!

If you are unsure whether what you are doing is over the top, the likelihood is that it is. Always err on the side of caution, but also don't be afraid to use the tips above in moderation. Common sense is really the best way to evaluate whether you are abusing or helping a search engine.

Alan Webb, ABAKUS Internet Marketing

 
