A search engine-optimized website helps search engines understand what your web pages are about and how to organize them. This helps ensure that your website is included in search engine results quickly, and that your web pages rank as highly as possible for relevant search terms, without the need for further search engine optimization work.

We apply no fewer than nine key steps in this process to every website we build:

1. Fast Page-Load Time

Search engines prefer web pages that load quickly. We will ensure that your web pages load as quickly as possible, within the constraints of the website's design and content, by employing the following techniques:

  • Using Standards-Compliant Code
  • Optimizing all images for the web
  • Avoiding the use of browser plugins
  • Enabling compression to reduce the size of all files
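As one illustration of the compression technique listed above, a website hosted on an Apache web server can enable gzip compression with a configuration fragment like the following (a sketch only; the exact directives and content types depend on the hosting environment):

```apache
# Compress text-based resources before sending them (requires mod_deflate)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
    AddOutputFilterByType DEFLATE application/json image/svg+xml
</IfModule>
```

Images are excluded because formats such as JPEG and PNG are already compressed.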

You may also be interested in our optional Enhanced Page Speed Optimization Add-On.

2. Mobile-Friendly Website Design

Since April 2015, Google has given preference, in searches carried out on mobile devices, to websites that are optimized for smaller screens, and the weight of this preference is expected to increase. We will design your website to display optimally on all screen sizes using "Responsive Web Design", ensuring that your website is given every possible advantage in search engine results.

3. Hierarchical Website Structure

Search engines strive to organize the internet, and we can help your website be indexed more thoroughly and more relevantly by organizing it into an appropriate hierarchical structure. A basic example: your website's blog entries, where applicable, would be located in a sub-folder of your website titled "blog", so all URLs related to your blog entries would begin with "www.yoursite.com/blog/". This helps search engines to organize and understand the information contained in your website.

4. Search Engine Friendly URLs

Another way to help search engines understand what your web pages are about is to have URLs that describe the content of each page. For example, many software products designed to aid website building produce URLs like "www.yoursite.com/content.php?id=234", which convey no information about the page's content. A similar page on a website built by us would have a URL like "www.yoursite.com/special-offers.htm".

The presence of ".htm" in the URLs of our websites does not mean a page is a static HTML file. Using URL-rewriting techniques, we can give even dynamic (database-driven) web pages URLs like this.
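On an Apache web server, the URL-rewriting technique described above can be sketched with a mod_rewrite rule like the following (illustrative only; the script name and ID are taken from the example above and are hypothetical):

```apache
# Serve the friendly URL "/special-offers.htm" from the dynamic script (requires mod_rewrite)
RewriteEngine On
RewriteRule ^special-offers\.htm$ content.php?id=234 [L,QSA]
```

The visitor (and the search engine) sees only the descriptive ".htm" address, while the database-driven script handles the request behind the scenes.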

5. Fully Indexable

Some websites have navigation menus constructed entirely using Adobe® Flash® or JavaScript. Search engines cannot reliably follow links that exist only inside such menus, since they primarily discover web pages through links in HTML code. We will ensure that all of your web pages can be found by search engines: wherever Adobe® Flash® or JavaScript is used, a plain HTML alternative will be presented to visitors who do not have it enabled (including search engine crawlers).

6. Relevant Metadata and Header Tags

Among the most important methods of informing search engines about your web pages' content is the use of relevant "meta tags" (such as the Title and Description tags) and H1 and H2 header tags. When you add content to your website via your Website Dashboard, these page attributes are generated automatically, optimizing each and every web page of your website for search engine indexing.
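For illustration, the page attributes described above typically appear in a page's HTML as follows (the titles and description here are placeholders):

```html
<head>
  <title>Special Offers | Your Site</title>
  <meta name="description" content="Current special offers on our products and services.">
</head>
<body>
  <h1>Special Offers</h1>
  <h2>This Month's Deals</h2>
</body>
```

The Title and Description commonly appear as the headline and snippet of your page's entry in search results, while the header tags signal the relative importance of sections within the page.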

Search engines have attached less and less importance to meta tags as each year passes, primarily because this data has been exploited to artificially inflate search engine rankings (Google announced in September 2009 that it ignores the keywords meta tag completely). All websites we build include only "Search Engine Friendly" meta tags that conform to all current recommendations.

Where your website allows visitors to share your pages directly via social media, we will also integrate additional meta tags, such as "Open Graph" and "Twitter Card" tags, which ensure that Facebook and Twitter use the correct title, summary and image for your web pages when a link to one of them is shared on those networks. This gives you full control over how your brand and website are represented on social networking sites.
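A sketch of what these tags look like in a page's head (all values here are placeholders):

```html
<!-- Open Graph tags, read by Facebook and other platforms -->
<meta property="og:title" content="Special Offers">
<meta property="og:description" content="Current special offers on our products and services.">
<meta property="og:image" content="http://www.yoursite.com/images/offers.jpg">
<meta property="og:url" content="http://www.yoursite.com/special-offers.htm">

<!-- Twitter Card tags, read by Twitter -->
<meta name="twitter:card" content="summary">
<meta name="twitter:title" content="Special Offers">
```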

7. Addition Of "Rel" Links

Where appropriate, your website's pages will include "rel" links in the head of their HTML, specifying "canonical" pages and pagination structures. For example, if your website contains a list of content divided across multiple pages, specifying the first page as "canonical" and marking each page's "previous" and "next" neighbours in the set helps search engines understand which page of the set to give priority to.
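As an illustration of these "rel" links, the head of the second page of a paginated blog listing might contain the following (URLs are placeholders):

```html
<!-- On page 2 of a paginated list of blog entries -->
<link rel="canonical" href="http://www.yoursite.com/blog/">
<link rel="prev" href="http://www.yoursite.com/blog/page1.htm">
<link rel="next" href="http://www.yoursite.com/blog/page3.htm">
```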

8. Dynamic XML Sitemap Creation

An XML sitemap is a list, written in XML, of all the pages of your website that you would like search engines to include in their search results. Having an XML sitemap can result in more of your website's pages being included in search engine results, and in new pages being included more quickly after they are created. The location of your website's XML sitemap is specified in your website's "robots.txt" file, which search engines read to learn about your website. Your XML sitemap will always accurately reflect the pages of your website, even as you add or remove pages via your Website Dashboard.

The location of the XML sitemap for all websites we build is at the following URL:

http://www.yourdomain.com/sitemap.xml (where "yourdomain.com" is the actual domain name your website uses)

If you use Google Search Console (formerly Google Webmaster Tools) to help administer your website, you can enter this URL when you add an XML sitemap.
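For reference, a minimal XML sitemap has the following shape (the URLs and date here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomain.com/</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.yourdomain.com/special-offers.htm</loc>
  </url>
</urlset>
```

Each `<url>` entry lists one page; the optional `<lastmod>` element tells search engines when a page was last changed.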

9. Addition of "Robots" Directives

We will create a "robots.txt" file that tells search engines which pages of your website, if any, they should not include in their search results (for example, you may wish to exclude a page like your Privacy Policy). We will also add a "robots" meta tag to any individual pages you want excluded. The "robots.txt" file also points search engines to the internet location of your website's XML sitemap, facilitating the prompt and thorough inclusion of your website's pages in their search results.
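A "robots.txt" file covering both purposes described above might look like this (the excluded path is a placeholder):

```text
# Ask all crawlers not to index the privacy policy, and advertise the sitemap
User-agent: *
Disallow: /privacy-policy.htm

Sitemap: http://www.yourdomain.com/sitemap.xml
```

On an individual page, the equivalent exclusion is the tag `<meta name="robots" content="noindex">` placed in the page's head.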