10 SEO-Friendly Website Elements that Developers Often Miss

When it comes to building websites, developers rarely miss the on-page aspects of SEO. Title tags, headings, meta tags, semantic URLs, body text and other elements visible to the eye all play a significant role in improving a website's search rank, and they usually come as standard – it's easy to get them right without a second thought. Still, there are some aspects that affect a website's SEO and tend to be forgotten by web developers rushing to deliver their projects on time.

Some of these elements belong to the technical side, others to analytics and website structure, but all are equally essential in building a website that will rank high in search engine results.

Here are the top ten SEO-friendly website elements that you might be missing in your project.

1. JavaScript

We all love JavaScript – it's powerful and can add a lot to the overall user experience of a website. When building your site, though, bear in mind that search engines struggle to read JavaScript code. That's not to say the situation will never change – in fact, search engines are getting better at understanding and executing JavaScript so they can access more content.

To avoid any JavaScript-related SEO failure, remember two key points. First, don't place valuable content inside JavaScript code – otherwise you risk that search engines won't be able to access and index it. Keep the content itself in plain HTML and use JavaScript only to enhance it, as sketched below. (Note that swapping in a library like jQuery doesn't help here – jQuery is JavaScript, so the same limitations apply.)

Second, don't use JavaScript to hide content from search engines. A trick that worked in the past won't necessarily keep working – just because search engines couldn't see content tucked away in JavaScript a few years ago doesn't mean they'll never be able to see it.
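Here's a minimal sketch of that progressive-enhancement approach – the markup and the reviews content are purely illustrative:

```html
<!-- The content lives in plain HTML, so crawlers can index it
     even if they never execute the script. -->
<div id="reviews">
  <h2>Customer reviews</h2>
  <p>"Great product, fast shipping." – Anna</p>
</div>

<script>
  // JavaScript only enhances what's already there – here it adds a
  // show/hide toggle. If the script never runs, nothing is lost.
  var reviews = document.getElementById('reviews');
  var toggle = document.createElement('button');
  toggle.textContent = 'Hide reviews';
  toggle.onclick = function () {
    var hidden = reviews.style.display === 'none';
    reviews.style.display = hidden ? '' : 'none';
    toggle.textContent = hidden ? 'Hide reviews' : 'Show reviews';
  };
  reviews.parentNode.insertBefore(toggle, reviews);
</script>
```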

2. IP Detection and Redirection

This is a great piece of functionality for serving location-based content – the website detects the user's geographical location and redirects them to a more relevant site or page. IP detection works great for users, but not for search engine crawlers. A crawler will usually crawl from a US-based IP address, so it may never reach the rest of your content, such as versions of your website in other languages.

When applying IP detection and redirection to your website, you need to weigh your options carefully. While users generally enjoy it, search engines might simply find a limited number of pages to index and never get to see the entire content – all of which will negatively impact the search rank of your website.
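One safer pattern is to suggest rather than force: detect the visitor's location, but show a banner instead of redirecting, so both crawlers and users can still reach every version. A rough Express sketch, assuming the geoip-lite npm package:

```js
// Rough sketch, not production code: suggest a regional version via a
// banner instead of force-redirecting, so crawlers can reach every page.
// Assumes Express and geoip-lite (npm install express geoip-lite).
const express = require('express');
const geoip = require('geoip-lite');

const app = express();
// Behind a reverse proxy you'd also need app.set('trust proxy', true)
// so req.ip reflects the real client address.

app.use((req, res, next) => {
  const lookup = geoip.lookup(req.ip); // null for unknown/private IPs
  // Expose the detected country to the page; let it render a
  // "Looking for our German site?" banner instead of redirecting.
  res.locals.suggestedCountry = lookup ? lookup.country : null;
  next();
});

app.get('/', (req, res) => {
  const hint = res.locals.suggestedCountry
    ? `<p>Visitors from ${res.locals.suggestedCountry} might prefer a local version.</p>`
    : '';
  res.send(`<h1>Welcome</h1>${hint}`);
});

app.listen(3000);
```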

3. 301 and 302 Redirects

A lot of web developers never consider the importance of 301 redirects, even though they can really impact a website's SEO. If you're migrating a website, always make sure that every page from the old version redirects to its counterpart on the new one.

If you're still actively working on your website, it's more than likely that you'll be removing or changing page URLs – make sure your CMS creates automatic 301 redirects. This way, you won't lose any link juice. A 302 redirect is reserved for pages that are only temporarily unavailable – which essentially means that search engines won't pass link juice across the redirect and won't remove the old URL from their index.

Now, this is all fine and well until developers mix up the two and really hurt their SEO. The mistake is easy to make because, when redirected to a different page, users won't notice any difference – only search engines will. If you move your entire site to a new domain using 302 redirects, no link juice will be passed, and search engines will keep pages from both your old and new domains in their index. What a mess!
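For reference, on a Node/Express stack picking the right redirect is a one-liner – a minimal sketch with made-up paths:

```js
// Minimal Express sketch – the paths are made up for illustration.
const express = require('express');
const app = express();

// 301: the move is permanent, so link equity should flow to the new URL.
app.get('/old-pricing', (req, res) => res.redirect(301, '/pricing'));

// 302: the move is temporary, so search engines keep /sale indexed.
app.get('/sale', (req, res) => res.redirect(302, '/sale-paused'));

app.listen(3000);
```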

4. XML Sitemaps

Sitemaps are really easy to set up and cost nothing, but if a website is under constant development, web developers can easily get discouraged by the constant need to update the sitemap to help crawlers index the site.

One thing is clear – for large websites, sitemaps are an absolute must. So how do you deal with a dynamic or constantly changing website? Instead of updating the sitemap manually, let your CMS do it automatically whenever the website changes. Some packages do this natively, others need a plugin, but this functionality should always be in place.
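And if your CMS can't generate one for you, a sitemap is simple enough to script yourself. A bare-bones Node sketch – the URL list and dates stand in for whatever your CMS or database actually exposes:

```js
// Bare-bones sitemap generator sketch. The pages array is a placeholder
// for your real data source (CMS API, database query, etc.).
const fs = require('fs');

const pages = [
  { loc: 'https://www.example.com/', lastmod: '2015-01-10' },
  { loc: 'https://www.example.com/about', lastmod: '2015-01-08' },
];

const entries = pages
  .map(p => `  <url>\n    <loc>${p.loc}</loc>\n    <lastmod>${p.lastmod}</lastmod>\n  </url>`)
  .join('\n');

const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries}
</urlset>`;

// Run this on every content change (or on a schedule) so the
// sitemap never goes stale.
fs.writeFileSync('sitemap.xml', sitemap);
```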

5. Site Speed

Countless studies have demonstrated the effect of slow load times on user experience and conversion rates, so you're probably aware of this issue. Site speed matters not only to users but to search engines as well – load time is now a ranking factor, notably for Google.

Site speed rarely comes down to the application code itself – it has more to do with optimizing HTTP requests, compressing external files, serving static assets from cookieless domains and loading JavaScript asynchronously.

To check the speed of your website, use WebPagetest (http://www.webpagetest.org/) or have a look at Google's Page Speed toolkit (http://code.google.com/speed/page-speed/). The latter will not only measure your site's speed but also recommend possible improvements.
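The asynchronous-loading point in particular costs almost nothing to apply:

```html
<!-- Scripts with async/defer don't block HTML parsing. -->

<!-- async: fetched in parallel, executed as soon as it arrives
     (execution order is not guaranteed) -->
<script async src="/js/tracking.js"></script>

<!-- defer: fetched in parallel, executed in order after parsing finishes -->
<script defer src="/js/app.js"></script>

<!-- Old-school alternative: inject the script element from JavaScript -->
<script>
  var s = document.createElement('script');
  s.src = '/js/widgets.js';
  s.async = true;
  document.head.appendChild(s);
</script>
```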

6. Canonicalization

When it comes to SEO, duplicate content is nothing short of an enemy. Once Google spots it, duplicate-content issues will annoy you to no end. That's why you need to take care of the canonicalization of your URLs at the very beginning of your work.

Whether you pick www or non-www makes no difference to SEO, as long as you stick to one of them. Another step is making sure that your CMS or other platforms aren't creating multiple URLs for each page or appending extraneous data to your URLs. There are lots of tools on the web to help you check how many pages your website exposes, such as Xenu's Link Sleuth (http://home.snafu.de/tilman/xenulink.html).

Once you're done with that, implement the canonical tag, which basically tells search engines what the definitive URL of a page should look like. Even if you're not suffering from a surplus of URLs, rel=canonical will guard you against potential problems caused by links arriving with extra URL parameters attached. Plus, it's really easy to implement, so you won't lose any time doing it – you can only gain!
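Implementing it really is a one-liner in the head of each page (example.com stands in for your own domain):

```html
<!-- Tells search engines which URL is the definitive one, so that
     /product?utm_source=newsletter and /product consolidate into
     a single indexed page. -->
<link rel="canonical" href="https://www.example.com/product" />
```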

7. Analytics

SEO and web marketing depend on good analytics. It's surprising how many people forget to set up analytics before launching a website. Both large and small websites should be constantly monitored – analytics will give you a wealth of information on how your website is being found, accessed and used.
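Wiring in Google Analytics, for instance, takes only a few lines – the snippet below is roughly Google's asynchronous analytics.js pattern, with a placeholder property ID:

```html
<!-- Roughly Google's async analytics.js pattern; UA-XXXXX-Y is a
     placeholder – use your own property ID. -->
<script>
  // Queue commands until the library has loaded.
  window.ga = window.ga || function () { (ga.q = ga.q || []).push(arguments); };
  ga.l = +new Date();
  ga('create', 'UA-XXXXX-Y', 'auto');
  ga('send', 'pageview');
</script>
<script async src="https://www.google-analytics.com/analytics.js"></script>
```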

8. Crawler Access

This one is often overlooked by web developers. Crawler access and crawler allowance optimization are both based on an in-depth understanding of the ways in which search engines index websites. Even though it's a true tech giant, Google has limited resources and won't be able to crawl and index each and every page of your website. That's why you need to make sure that whatever it finds is worth the trouble.

Don't waste crawler allowance on insignificant pages – make sure Google and other search engines spend their time on the ones that really matter. A clear site architecture with a well-defined page hierarchy already helps a lot.

As a web developer, you can't be expected to deliver a website and then have an SEO expert suggest you rebuild its architecture from scratch. What you can do is use robots.txt to mark the pages and URL patterns search engines aren't permitted to crawl, as in the sketch below.
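A minimal robots.txt might look like this (the paths are placeholders):

```
# Example robots.txt – the disallowed paths are placeholders
User-agent: *
Disallow: /admin/
Disallow: /search
Disallow: /tmp/

# Pointing crawlers at your sitemap here is a good habit too
Sitemap: https://www.example.com/sitemap.xml
```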

9. HTTP Header Codes

All web developers know these header codes, but not all of them realize their impact on a website's SEO. For a comprehensive view of what the various header codes mean, have a look at the infographic prepared by SEOgadget (http://sixrevisions.com/infographs/server-headers-101/).

We've already covered 301 and 302, so it's time to move on. 404 Page Not Found basically means that the requested page cannot be found. The effect? After a while, search engines stop ranking it. Always make sure that your "page not found" page actually returns a 404 header – a missing page that responds with 200 (a so-called soft 404) tells search engines the page exists and can quietly undermine your SEO efforts.
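On an Express stack, for example, the catch-all "not found" handler should set the status explicitly – a minimal sketch:

```js
const express = require('express');
const app = express();

app.get('/', (req, res) => res.send('<h1>Home</h1>'));

// Registered last: anything unmatched falls through to here.
// Without the explicit status(404), the page would ship with 200 –
// a soft 404 that search engines treat as a real page.
app.use((req, res) => {
  res.status(404).send('<h1>Page not found</h1>');
});

app.listen(3000);
```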

10. Flash

This is basic, but still worth a mention. Just like JavaScript, Flash is hard for search engines to understand, and despite several improvements they are still largely in the dark – more so than with JavaScript. Google is able to extract some text and URLs from Flash, but it's probably the only search engine that can. Don't use Flash for anything more than enhancing your website – never for core content or navigation.

Even though web developers aren't supposed to be SEO experts, there's plenty of knowledge out there to help them support their colleagues in getting a website to rank high in search results. As the list above shows, opportunities to improve a website's SEO lurk in every corner of its structure.