In Part II, we looked into seven more Onsite SEO tactics to help improve your Search Engine Rankings. In Part III we shall examine some more advanced tactics, so if you’re not a Web Designer or this seems overly technical, just ask us for assistance via our Contact Form.
This article assumes you’ve read Understanding Search Engines for a basic overview of how Search Engines work.
Now, here are six more methods for improving Onsite SEO:
Submit your website to Search Engines
All three major Search Engines, Google, Bing and Yahoo, have online forms that allow you to submit your website’s main URL to initiate crawling, although Yahoo’s search has recently merged with Bing, so its submission process is now handled differently. Bear in mind this submission only takes the Search Engine spider to the homepage, so if you want it to crawl (i.e. scan for keywords) other pages within the website, the robot must either:
- Follow links found internally on the website, or
- Follow links appearing on external websites (i.e. other websites). Unless a website consists of only a few pages, it is unlikely all of its webpages can be found through external links. Rather, Search Engine robots rely on the website itself to locate content within the site. This means a website must contain an internal linking system to guide Search Engine robots as they index your site.
Build an Intuitive Website Navigation
A website should be designed so it is easy for both users and Search Engine robots to navigate the site and access all the material it contains. Navigation is usually handled via a menu system that includes links to the important content areas. Some websites, particularly those with a small number of pages, provide access to nearly all internal webpages through a single main menu. Websites with many content areas often follow a hierarchical design, where the main menu links to the important sub-sections, which in turn present users with menu links to further sub-areas. In some cases sub-areas may contain yet more navigation menus.
Before a Search Engine can determine whether a webpage satisfies a search query, the Search Engine must be able to locate the content. As mentioned in the article Understanding Search Engines, Search Engines use robots (i.e. automated computer programs) called “spiders” or “crawlers” to search the Internet for websites.
These robots locate and follow links within a site to find its webpages. As the robots use the website’s navigation for this purpose, it is imperative to have a logically structured navigation system. Remember, if a robot cannot find a webpage then your users won’t be able to either! Once a suitable webpage is found, the robot locates the content, crawls it for keywords and indexes the findings in its databases so the page can be returned for users’ search queries.
If your website only has a few pages, it is simple to build a menu system which incorporates each webpage. However, if your website has many pages, it may be difficult to accommodate them all in a single menu. It may therefore be necessary to use a dynamic menu, such as a dropdown menu, to list the webpages in a particular section of the website.
Such functionality often requires more advanced programming methods like JavaScript or Flash. This, however, may make it difficult for Search Engines to determine the webpage structure, especially if Flash is used. To overcome this it may be necessary to implement a sub-menu in each website section, rather than have one main menu which encompasses everything.
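A crawlable sub-menu can be nothing more than ordinary HTML links. The sketch below is a minimal illustration only – the page names and URLs are hypothetical:
<ul>
  <li><a href="/services/web-design.html">Web Design</a></li>
  <li><a href="/services/seo.html">Search Engine Optimization</a></li>
  <li><a href="/services/hosting.html">Website Hosting</a></li>
</ul>
Because these are plain links rather than JavaScript or Flash, Search Engine spiders can follow them just as a user would.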
Create an XML Sitemap
A Sitemap is basically a navigation tree depicting a website’s structure, and it complements the crawlers Search Engines use to discover new webpages. Sitemaps come in two types, XML and HTML: XML Sitemaps are for the benefit of Search Engines, while HTML Sitemaps benefit users.
XML Sitemaps allow you to include additional information about each URL or webpage on your site, such as when the page was last updated, how often the page changes, and how important the page is in relation to other webpages on the website. This enables Search Engines to crawl your website more intelligently by only crawling pages that need to be refreshed or added to their database. This minimizes the traffic on both the Search Engine’s servers and yours, which means less data bandwidth is used, lowering costs for both you and them.
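As a minimal sketch, an XML Sitemap entry following the sitemaps.org protocol looks something like the following (the URL and values are hypothetical):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourwebsite.com/services/seo.html</loc>
    <lastmod>2013-05-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
Here lastmod records when the page was last updated, changefreq how often it typically changes, and priority its importance relative to your other webpages.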
By implementing an XML sitemap in conjunction with a robots.txt file, Search Engine spiders are able to find more of your webpages while minimizing the resources used to index your pages.
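One simple way of tying the two together, assuming your sitemap has been uploaded to the root of your site, is to point to it from your robots.txt file with a single line:
Sitemap: https://www.yourwebsite.com/sitemap.xml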
However, an XML Sitemap does not guarantee a website or webpage will be indexed by Search Engines. It’s more important to have an intuitive navigation structure that enables Search Engines and users to access every page on your website.
Create a Robots.txt
When a Search Engine crawler visits your website, it will look for a file called robots.txt. This file instructs the Search Engine which webpages should be indexed and which should be ignored.
Robots.txt is a simple text file which must be placed in your root directory, for example: https://www.yourwebsite.com/robots.txt
The content of a robots.txt file consists of "records" which contain the information relevant to Search Engines. Each record consists of two fields: a User-agent line and one or more Disallow lines.
For example:
User-agent: googlebot
Disallow: /cgi-bin
This robots.txt file allows Google’s robot (googlebot) to retrieve every webpage from your site except for pages in the "cgi-bin" directory, meaning all files in the "cgi-bin" directory will be ignored.
The SEO benefit is that robots.txt tells search engine spiders to skip pages you consider not important enough to be indexed. This gives your website’s most valuable pages a better opportunity to be featured in the Search Engine results pages. Essentially, robots.txt is a simple way of steering spiders towards the pages you most want returned in search engine results.
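As a slightly fuller sketch, the hypothetical robots.txt below applies to every spider and skips a few directories that rarely belong in search results (the paths are examples only):
User-agent: *
Disallow: /cgi-bin/
Disallow: /admin/
Disallow: /search-results/
The asterisk means the rules apply to all Search Engine robots, and each Disallow line names a directory the spiders should skip.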
Use Redirections to Eliminate Broken Links
A redirection is the process of forwarding one URL to a different URL. It is a method of sending both users and Search Engines to a different URL from the one originally requested.
The main reason for using redirects is to prevent Search Engines and users from accessing broken URLs. Imagine having a webpage full of important information, but the URL stored in the Search Engine’s index is broken – people would be taken to the broken URL and receive a “404 Page not found” error. This could be disastrous if the webpage was vital to one of your marketing campaigns!
Fortunately Search Engines don’t rank webpages with 404 errors highly, but you’re still losing out by not informing them when one of your important webpages has moved. Also, it’s important to remember Search Engines don’t update their indices the exact moment your webpage changes location, so it is possible to have a highly ranked 404 error page until such time as the Search Engine updates its index.
There are different kinds of online redirects:
- 301 Moved Permanently
A 301 Redirect is a permanent redirect which passes between 90-99% of the ranking power to the redirected page. “301” refers to the HTTP status code for this type of redirect. In most cases, the 301 redirect is the best method for implementing redirects on a website.
- 302 Found (HTTP 1.1) / Moved Temporarily (HTTP 1.0)
A 302 Redirect is a temporary redirect which passes 0% of the ranking power and in most cases should not be used. The Internet runs on a protocol called HyperText Transfer Protocol (HTTP) which dictates how URLs work. It has two major versions, 1.0 and 1.1. In the first version 302 referred to the status code ‘Moved Temporarily’; this was changed in version 1.1 to mean ‘Found’.
- 307 Moved Temporarily (HTTP 1.1 Only)
A 307 Redirect is the HTTP 1.1 successor of the 302 redirect. While the major crawlers treat it like a 302 in some cases, it is best to use a 301 for almost all cases. The exception is when content is only temporarily moved (such as during maintenance) AND the server has already been identified by the search engines as HTTP 1.1 compatible. Since determining whether the search engines have identified this is essentially impossible, it is better to use a 302 redirect for content that has been temporarily moved.
- Meta Refresh
Meta refreshes are a type of redirect executed at the page level rather than the server level, meaning they are slower and not a suitable SEO technique. They are most commonly associated with a 5 or 10 second countdown accompanied by the text “If you are not redirected in 5 seconds, click here”.
Overall, the 301 redirect is the preferred option as it indicates to both browsers and search engine bots that the page has moved permanently. Search Engines interpret this to mean not only has the webpage changed location, but that the content, or an updated version of it, can be found at the new URL. The Search Engines will then carry any link weighting from the original webpage over to the new URL.
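How you set up a 301 redirect depends on your web server. As a minimal sketch, on an Apache server the line below could be added to the site’s .htaccess file (the old and new paths are hypothetical):
Redirect 301 /old-services.html https://www.yourwebsite.com/services.html
Other platforms, such as IIS servers or content management systems, have their own redirect settings, so check your hosting documentation or ask us if you’re unsure.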
Create a Custom 404 “Page not found” webpage
The 404 "Page not found" error is the webpage displayed when a user requests a webpage no longer available on your website. The reason is usually that a link on your site was incorrect, or the webpage has been moved or deleted. Because there is no webpage to display, the web server instead sends a page saying "404 Page not found".
As mentioned in the previous point, you can use redirects to reduce the number of 404 errors, although this may not be possible in all cases. What you can do is create a custom 404 page with links to your most important content webpages, and even include a search box to enable users to search for what they’re looking for. The search box won’t be of much use to Search Engines, but at least it gives users an incentive to keep using your website.
The webpage links, however, are an opportunity to word link text differently from the titles that appear on the webpages themselves. For example, say this webpage is titled “Customising a 404 Error Page for SEO Benefits”. You could instead link to it as “Using 404 Error Pages to Your SEO Advantage”, providing all new keywords and key phrases for the search engine’s index.
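As a minimal sketch, on an Apache server you could tell the web server to use your custom page via the .htaccess file, then build that page with a handful of rewritten links and a search box (the filenames, titles and search URL are hypothetical):
ErrorDocument 404 /404.html
And a simple 404.html might contain:
<h1>Sorry, we couldn’t find that page</h1>
<ul>
  <li><a href="/404-error-pages.html">Using 404 Error Pages to Your SEO Advantage</a></li>
  <li><a href="/services/seo.html">Boosting Your Search Engine Rankings</a></li>
</ul>
<form action="/search" method="get">
  <input type="text" name="q">
  <input type="submit" value="Search">
</form>
The ErrorDocument line tells Apache to serve 404.html whenever a page cannot be found, while the links and search box give both users and spiders somewhere useful to go next.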
That brings us to the end of our Onsite Search Engine Optimization tactics. Combined with Part I and Part II, you now have 18 different methods to help boost your Search Engine rankings!
However, On-Site SEO techniques are only one side of the equation – there are also Off-Site SEO strategies which are equally important. So check out our article on Off-Site Search Engine Optimization – Part I for more great ideas!
Don’t forget FREE VIPs are eligible for a FREE Website Improvement Report. In each report we analyze your website and explain how its appearance, usability, content and SEO can be improved so it can reach its fullest potential.
If you need help with improving the Search Engine Optimization of your website, have any questions, or would like a FREE, no obligation consultation, don’t hesitate to contact us through the form below.