Month: September 2011

Search Engine Optimisation (SEO) with Ektron

I don’t claim to be the world’s greatest expert on Search Engine Optimisation (SEO), but over the years I have picked up enough to make the sites I build perform pretty well in search engines. There are many people who do nothing but SEO, and many others who know nothing about it or are badly misinformed – this article is aimed at the latter.

It’s easy to get the basics of SEO right.

The tips given here are aimed more towards how to implement SEO within Ektron’s CMS400.net platform.

The low-down on SEO

Search engines such as Google or Bing use a technique called “crawling”, whereby they literally load a page on your site, scan its content, then follow all of the links until they have covered the whole site. As the search engine scans a page it ranks it based upon what it finds; certain elements on the page carry more weight than others.

The elements in a page are ranked roughly like so (the following list may not be 100% correct, but it gives you an idea):

  1. Site domain name
  2. Page name (i.e. the path)
  3. Title tag
  4. Header tags (H1, H2, etc)
  5. Content within the body
  6. Metadata

Using that list as a guide you can see that meaningful domain and page names are very important. However, all of these elements must work together. If your page name is “bananas.html” but all of your content is about spark plugs, the search engine won’t give the page a good ranking because its message is confused.

Make it relevant; relevant to the page not just to the site.
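To make that concrete, here is a minimal sketch of a page where the domain, path, title, headings and body all reinforce the same topic. The domain and content are made up purely for illustration:

```html
<!-- Hypothetical page at http://www.bananas.com/growing-guide/ -->
<html>
<head>
  <title>How to grow bananas at home</title>
  <meta name="description" content="A beginner's guide to growing bananas at home." />
</head>
<body>
  <!-- Headings and body content all on the same topic as the URL and title -->
  <h1>How to grow bananas at home</h1>
  <h2>Choosing a variety</h2>
  <p>Bananas grow best in warm, humid conditions...</p>
</body>
</html>
```

Every element a crawler looks at is telling the same story, which is exactly what you want.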

Use URL Aliasing

Aliases, a.k.a. friendly URLs, are a great way to boost your SEO ranking. Ektron has a great aliasing engine built into the product. I always recommend using the Automatic Aliasing option as it saves time for the content authors, although the Manual option gives authors a tighter level of control.

Automatic Aliasing can be set to use either your content folder structure or a taxonomy – both are great for boosting SEO – use whichever makes sense to your project.

The great thing about aliasing is that all internal links within the site will also pick up on the aliases, so when you build your menus all of the items within the menu will have the aliases in the HREF attribute – search engines will lap this up.
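As an illustration (the id and paths here are hypothetical, and the exact shape of an unaliased URL depends on your templates), the difference in a menu item’s HREF looks something like this:

```html
<!-- Without aliasing: a raw template URL carrying a content id -->
<a href="/template.aspx?id=1234">Growing guide</a>

<!-- With automatic aliasing based on the content folder structure -->
<a href="/fruit/bananas/growing-guide/">Growing guide</a>
```

The second link gives the crawler meaningful keywords in the path itself, for free, everywhere that menu is rendered.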

Set Page Titles

The TITLE tag within HTML should always be given a meaningful value that is descriptive of the text contained within. A common practice is to always include the name of the site in the TITLE tag, like:

“Bananas are my favourite fruit – Bananas.com”

I’m not a massive fan of doing this; I would rather concentrate on having a good title. The search engine (and indeed the end user) already knows what site it is on. Search engines are very wary of sites trying to spam them to falsely improve their results, and I would prefer to do things properly instead of trying to beat Uncle Google and Cousin Bing.
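In other words, of these two (the wording is made up for illustration), I’d rather ship the first:

```html
<!-- Descriptive title that stands on its own -->
<title>Bananas are my favourite fruit</title>

<!-- The common alternative, padded with the site name -->
<title>Bananas are my favourite fruit – Bananas.com</title>
```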

Use good HTML markup

The HTML content will usually be under the end client’s control rather than yours. For standard HTML content items this is certainly true.

However, for SmartForm XML content items you can put a structure in place into which the client can add their content. XML content is generally rendered on the page using an XSL file; this XSL file should emit good markup – I’m talking about H1, H2, P, etc. tags.

Try running some pages through an XHTML validator for tips on how to improve your markup.
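As a rough sketch, an XSL file for a SmartForm might map the XML fields onto proper heading and paragraph tags like this. The element names (root, Title, Section, Heading, Text) are hypothetical – use whatever your SmartForm actually defines:

```xml
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/root">
    <!-- Main page heading from the SmartForm's title field -->
    <h1><xsl:value-of select="Title" /></h1>
    <!-- Each section becomes a proper sub-heading and paragraph -->
    <xsl:for-each select="Section">
      <h2><xsl:value-of select="Heading" /></h2>
      <p><xsl:value-of select="Text" /></p>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
```

The point is that the structure of the rendered page – H1, H2, P – is decided once, in the XSL, rather than being left to each author.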

Use Metadata

There was a time when Metadata was King. Sites would flood their Keyword and Description metadata fields with all sorts of keywords in the hope of boosting their search engine ranking. These days it is debatable whether metadata has any value at all.

Clients I work for always want metadata to be in place and I believe it is good practice to provide the metadata fields to them. I always stress to them not to waste too much time on metadata as the time is better spent elsewhere.
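Providing the fields just means making sure the rendered page carries something like the following (the content here is illustrative); fill them in sensibly and move on:

```html
<meta name="description" content="A beginner's guide to growing bananas at home." />
<meta name="keywords" content="bananas, growing, fruit" />
```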

Page Structure

Another tip I came across a while ago is to keep all of your important content as close to the top of the page as you can, and push things like JavaScript towards the bottom. This tip is based upon the idea that search engines only crawl a certain amount of content before they stop.

Let’s say the page size is 100K and the search engine only crawls the first 75K; the remaining 25K is discarded. If that 25K contained most of your body content then the page will not have been scanned properly.

Again, I wouldn’t get too hung up on this one – if a search engine wasn’t able to scan your whole page, I would be more concerned about the size of the page itself. Moreover, whatever the size limit is, I think it is large enough that you need not worry as long as you are already following good practice when building a website.
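If you want to follow the tip anyway, the shape is simply content first, scripts last. The file name here is made up:

```html
<body>
  <!-- The important body content comes first, high up in the source -->
  <h1>How to grow bananas at home</h1>
  <p>All of the important body content sits at the top of the page...</p>

  <!-- Scripts pushed to the bottom so they don't displace the content -->
  <script type="text/javascript" src="/js/analytics.js"></script>
</body>
```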

Page load speed

I can’t confirm whether this one is true, but it does make sense. Some people claim that if your site runs slowly it will be penalised by the search engine. This is based on the assumption that search engines are also somehow judging whether your site is “good” or not.

Page load speed is important, but not for SEO; it’s important in general.