For a successful Search Engine Optimization
strategy, keep in mind that search engines look both at your content
and at the structure of your markup. They emphasize text content, page
titles, keyword-rich copy, meta descriptions and information
architecture. A website where quality of content and code prevails
tends to rank higher in the major search engines.
There
are many Search Engine Optimization tactics, but try to find the best
combination and don't sacrifice the usability and performance of your
website. Here is some basic information about improving your source
code from an SEO perspective:
--Avoid default 404 error pages
The 404 - File Not Found - page is the error page a
server returns when it cannot find the requested resource. The user
gets this message directly from the server of the website they are
trying to visit.
In
the vast majority of cases, a 404 error comes up because a page was
moved or deleted, or because the site's URL structure or page layout
changed.
Many
hosting companies offer a 404 redirect page. This means that when a
user enters the URL of any page of your domain, and that page does not
exist or can no longer be found, you can automatically redirect the
user to a specified page - usually your home page or your sitemap.
Pay
special attention when you decide to delete certain pages. Remember to
redirect the old URLs to a relevant page of your site using the 301
(Moved Permanently) HTTP response code.
You can greatly improve the user friendliness of your website by creating a custom 404 page.
-Present a message of apology for the inconvenience;
-Ease the user's way back to your site: show a clear error message and
include prominent links to the home page, sitemap, and contact page;
-Offer assistance and encourage users to keep searching for the
information they need on your site. You could even include a search box
right on the error page;
-Keep the same design for the error page as for the rest of the website.
See: www.avangate.com/4040404404
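On an Apache server, a custom error page like the one described above can be wired up with a one-line directive in the site's .htaccess file; the file name used here is only an illustration - point it at whatever your error page is actually called:

```apache
# Serve a custom, user-friendly error page (hypothetical path)
# instead of the server's default 404 message
ErrorDocument 404 /custom-404.html
```

The path is relative to the web root, so the directive works regardless of which missing URL the visitor requested.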
But
remember that the best strategy for a 404 error page is to prevent it
from coming up altogether: many visitors may be left with the
impression that the whole website does not exist, not just the specific
page they were trying to access.
--Keep away from orphan pages
An orphan page is a page that no other page links
to, and thus cannot be found by spiders. To avoid orphan pages on your
website, check regularly that every page is reachable through links.
Search
engines consider sites with orphan pages unprofessional and not worthy
of a high rank: such a site looks as if it is under construction or the
result of a poor design process. If your pages aren't linked properly,
search engines won't index them and will treat them as irrelevant to
the search.
Pay
attention to the fact that some search engines don't index framed
websites correctly. When spiders crawl your internal pages, they index
each one individually, so framed content pages can show up as orphan
pages in search results. Most frame designs include a content frame and
a navigation frame, and visitors need both to navigate the site.
Add a small JavaScript check to each content page that detects when the
page has been loaded outside its frameset and reloads the full
frameset. Otherwise, users may land on a content page with no
navigation at all - a very easy way of losing clients.
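A minimal sketch of such a check, placed in the head of each content page; the frameset file name is an assumption, substitute your own:

```html
<!-- If this page was loaded outside its frameset (e.g. straight from a
     search result), reload the full frameset so navigation is available. -->
<script type="text/javascript">
  if (top == self) {
    // "index.html" is a hypothetical frameset page; use your site's own
    top.location.replace("index.html");
  }
</script>
```

`top == self` is only true when the page is the topmost document, i.e. it was opened outside any frame.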
--Use 301 redirect pages
To avoid displaying a 404 error page, set up a 301
redirect. The code 301 means "moved permanently", and it's the easiest
way to preserve your search engine rankings for that page.
There
are two ways of generating proper 301 redirects. If your site is hosted
on a Linux or Unix server, create (or edit) a .htaccess file in your
server's web root and add the redirect there.
The
.htaccess file contains per-directory instructions for the server,
covering security, redirection and how certain errors are handled. If
your site is hosted on a Windows/IIS server, the 301 redirect can be
set up in the server software's administration console instead.
Make
a habit of reviewing your server's log files. Search engine spiders
often make critical decisions based on what your server tells them
through its response headers.
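As a sketch of such a review, assuming an Apache combined-format access log, the 404s that visitors and spiders hit can be pulled out as below; the log path and sample entries are made up for illustration:

```shell
# Build a tiny sample log (stand-in for e.g. /var/log/apache2/access.log)
cat > /tmp/sample_access.log <<'EOF'
66.249.66.1 - - [01/Jan/2006:00:00:01 +0000] "GET /old-page.html HTTP/1.1" 404 512
66.249.66.1 - - [01/Jan/2006:00:00:02 +0000] "GET /index.html HTTP/1.1" 200 1024
66.249.66.1 - - [01/Jan/2006:00:00:03 +0000] "GET /old-page.html HTTP/1.1" 404 512
EOF

# In this log format field 9 is the status code and field 7 the URL:
# list URLs returning 404, most frequent first
awk '$9 == 404 { print $7 }' /tmp/sample_access.log | sort | uniq -c | sort -rn
```

Any URL that shows up here repeatedly is a candidate for a 301 redirect.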
Pay
attention: an improper 301 redirect can cause big problems, since a
misconfigured rule can take down your whole site and leave users unable
to reach any page. A correct 301 redirect, on the other hand, helps you
preserve your position in the search rankings.
--Create a sitemap
A sitemap is a web page that lists all the pages on
your website. It is intended both for users - to help them find the
information they need more easily - and for search engines, to help
them index your pages.
Your
sitemap link should be right on your home page, so spiders are sent
directly to the place where all your content is gathered. Sitemaps can
improve SEO; however, be advised that search engines may only follow a
limited number of links from a single page.
To
make sure spiders check your whole site and every page has a better
chance of being indexed, it is a good idea to use a sitemap generator.
One option is an ROR sitemap, which is readable by all search engines.
ROR is a sitemap format that uses XML feeds to describe your website.
ROR
sitemaps allow search engines to match text searches with structured
information, gaining more relevance for your site. This kind of sitemap
helps search engines better understand your website's content -
products, services, images, articles, etc. By creating a file with
product names, descriptions, prices, images, availability, affiliate
programs, and any other relevant information, customers can find you
more easily.
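Alongside ROR, the major engines also accept the plain XML sitemap protocol (sitemaps.org). As an illustration, a minimal file in that format looks like this - all URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; only <loc> is required -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products.html</loc>
  </url>
</urlset>
```

The file is typically saved as sitemap.xml in the web root and submitted to the search engines.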
--Don't overuse dynamic pages
Dynamic web pages are generated on the fly: their
content - images, text, etc. - can change without the page being
reloaded, and their URLs often carry query parameters. Client-side
languages like JavaScript and ActiveX are also commonly used to build
these types of web pages.
Search
engines don't rank dynamic pages with many parameters well. If you
choose not to turn your dynamic URLs into static ones, at least put the
most important parameters first in your URLs and try to limit dynamic
parameters to no more than two.
Spiders
can't read text rendered as graphics. Any text that you want spiders to
read and index should be written out as text. At the very least, put
any text that appears in graphics into the images' ALT attribute.
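For example, a navigation button drawn as an image should still carry its label in the ALT attribute; the file name and text here are illustrative:

```html
<!-- The label stays readable by spiders even though it is rendered
     as a graphic for human visitors -->
<img src="images/btn-products.gif" alt="Our Products" width="120" height="30" />
```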
A
slightly better alternative is to set your text in Flash, but remember
to provide a "Skip this intro..." link that takes visitors (and
spiders) straight to the text-rich content of your site.
--Put CSS and JavaScript into external files
Improperly formatted code can have a negative
impact on your rankings. Since search engines read only a limited
amount of information on a web page, you should try to increase the
ratio of text content to HTML markup.
If
you have too much HTML code, part of your text content may never be
seen. To reduce the markup, move presentation and behavior into
external .css and .js files.
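Instead of inline style and script blocks, reference the external files from the document head; the file names below are placeholders:

```html
<head>
  <title>Page title</title>
  <!-- Styles and behavior live in external files, keeping the HTML lean
       and improving the text-to-markup ratio spiders see -->
  <link rel="stylesheet" type="text/css" href="css/styles.css" />
  <script type="text/javascript" src="js/scripts.js"></script>
</head>
```

As a bonus, browsers cache the external files, so subsequent pages also load faster.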
--Make sure you have well-formed [X]HTML
Try to fix as many HTML errors as possible.
Although search engines don't rank standards-compliant websites higher
per se, they do tend to "read" them more easily. Use the W3C Markup
Validator to check the validity of your code.
Resource Box
Read the full article at Avangate.com/articles/Effective-SEO-through-good-code-structure.
Copyright © 2006, www.avangate.com all rights reserved. This article was written by a Web Marketing Specialist at Avangate B.V. Avangate is an eCommerce platform for electronic software distribution incorporating an easy to use and secure online payment system plus additional marketing and sales tools.
This article may be reproduced in a website, e-zine, CD-ROM, book, magazine, etc. so long as the above information is included in full, including the link back to this website.