Following on from my blog “I want to do my own Search Engine Optimisation (SEO) where do I start?” this series covers each of the seven areas in a little more detail so that you can really boost your rankings online.
Having the correct website structure will ensure search engines can read all of your web pages and index (list) the pages you need listed. The following six key points cover some of the most important parts of your website structure.
- Sitemap Web Page
A site map lists all, or at least the main, web pages on your website. It serves two purposes: it directs visitors of your website to all the pages of your site, and it gives search engines a set of links they can follow to every page you have listed. In short, a site map is exactly what it sounds like: a map of your website that leads browsers and search engines to all of its pages.
- Sitemap XML file
An XML sitemap is very similar to the normal page site map but with one very significant difference: it is aimed entirely at search engines. Search engines read XML sitemaps with ease, which allows them to index all the pages you want listed in an efficient process. Once you have your sitemap in place it’s a good idea to submit it to Google Webmaster Tools. It normally sits at the root level of your website but is not limited to this location.
<?xml version="1.0" encoding="UTF-8"?>
Save the file with the filename sitemap.xml, then upload it to your website root folder.
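To put the pieces together, a minimal sitemap.xml might look like the sketch below; the URLs and dates are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want indexed -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
  </url>
</urlset>
```

Only the `<loc>` element is required for each entry; `<lastmod>` and `<changefreq>` are optional hints to the search engines.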
- Robots Text File

The robots file is a simple text file that search engines check when they come to index a website. It sits at the root level of your website and is useful for telling search engines which pages you do not want listed. You can also use it to instruct search engines not to index your site at all, and to specify the interval at which they crawl (scan) your website. The robots file can also reference your website’s XML sitemap, which helps search engines quickly identify the location of your sitemap.
An example robots.txt file:
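As a sketch, a robots.txt that blocks one folder from indexing, suggests a crawl interval, and points to the sitemap might look like this (the folder name and domain are placeholders; note that the Crawl-delay directive is honoured by some search engines but not all):

```text
# Applies to all crawlers
User-agent: *
# Keep this folder out of the index
Disallow: /private/
# Suggested seconds between requests (not universally supported)
Crawl-delay: 10
# Tell crawlers where the XML sitemap lives
Sitemap: http://www.example.com/sitemap.xml
```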
What Wikipedia says:
“The Robot Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to prevent cooperating web spiders and other web robots from accessing all or part of a website which is otherwise publicly viewable.”
- Text Sitemap File

A text sitemap is very similar to an XML sitemap but is specific to Yahoo’s search engine and somewhat more simplified, although other search engines can also read this sitemap file.
Create a simple text based sitemap using notepad or other text editor as follows:
Create a new file in Notepad and enter your website addresses (URLs), one per line, as in the example below:
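A text sitemap is simply one full web address per line, nothing more; the addresses below are placeholders for your own pages:

```text
http://www.example.com/
http://www.example.com/about
http://www.example.com/contact
```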
Save the file with the filename urllist.txt, then upload it to your website root folder.
- CSS Files
Cascading Style Sheets (CSS) are a styling language used to describe the visual presentation of a document, most commonly an HTML web page or XML. They enable the separation of document content from document presentation (the visual look and feel). In terms of SEO, the main recommendation I give is to load all CSS code from separate external CSS files. This will speed up page load time and the scanning of your web pages.
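Moving styles out of the page is as simple as linking the stylesheet from the document head; the filename here is a placeholder:

```html
<head>
  <!-- All styling lives in an external file,
       keeping the HTML itself lean for crawlers -->
  <link rel="stylesheet" type="text/css" href="styles.css" />
</head>
```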
Combined with well-formed HTML pages, CSS lets you place your content ahead of other, less important page elements, which matters for ranking well with SEO. Moreover, because search engines aren’t intelligent and simply read your pages from the top down, it’s best to structure web pages so that valuable content is scanned first.
Example provided by: John Biundo & Eric Enge
<h1>About our products</h1>
<p>Here's our main content. It appears at the very top of the "body"
section of our file. The search engine will find this text easily
and weight it more importantly, for determining what the page is about,
than text it finds towards the bottom of the file, such as our
navigation links:</p>
<ul id="nav">
  <li><a href="http://www.mydomain.com/about">About Us</a></li>
</ul>
Stone Temple Consulting, http://www.stonetemple.com/articles/css-and-seo.shtml
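To illustrate the idea, a CSS rule along the lines of the sketch below can display a navigation list at the top of the page even though it sits near the bottom of the HTML; it assumes the navigation list is wrapped in an element with an id of nav, and the values are illustrative:

```css
/* Assumes the navigation <ul> has id="nav" */
#nav {
  position: absolute; /* take the list out of the normal document flow */
  top: 0;             /* draw it at the top of the page visually */
  left: 0;
}
```

The content stays first in the HTML source, where search engines scan it as a priority, while visitors still see the navigation where they expect it.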
What you will notice about most of the points above is that each one is stored as a separate file, normally placed at the root of your website, and acts as a sign post directing the search engines to the right type of information with which to rank your website.
- Valid HTML Mark-up

Whether you are about to design and create a new website or you have an existing website you would like to search engine optimise, you need to ensure the mark-up (the HTML code behind your page) is valid.
- Ensure correct use of HTML elements with the correct document type declaration.
- Place styles in a separate Cascading Style Sheet (CSS); this helps keep your HTML page primarily for content only.
- Use Flash multimedia sparingly. Flash should only be used to enhance the design of a page; an entire website should not be built in Flash, as search engines cannot currently read Flash files the way they read a normal HTML website.
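Pulling these points together, a minimal sketch of a valid page declares its document type, links an external stylesheet, and keeps the body to content; the title and filename are placeholders:

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="UTF-8" />
  <title>About our products</title>
  <!-- styles kept in an external file -->
  <link rel="stylesheet" href="styles.css" />
</head>
<body>
  <h1>About our products</h1>
  <p>Main content comes first, so search engines scan it as a priority.</p>
</body>
</html>
```

Running a page like this through an HTML validator will confirm the mark-up is valid before the search engines see it.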
So here we’ve tackled some of the more technical requirements of setting up your website so that search engines can get a better grasp of your web pages. By implementing the points above you will set your website off on the right foot, giving yourself a chance of great rankings by presenting the search engines with what they need (your content) and pushing all other website factors into external files.
If the points above seem a little confusing, it’s because they are the website features not so commonly talked about. The key thing I want you to take away is an awareness of what you should, at the very least, be factoring into your SEO. So good luck.
Read the follow up to this blog: SEO 6 of 7: Links. – This is the must read!