10 Key Tips to Include in a Technical SEO Audit – For Non-Techies

9th April 2013

In order to do a fully comprehensive technical SEO audit, it helps to have a developer background.  Many SEOs in the industry have such a background, which helps them spot some of the more complex technical errors on a site.

However, for those who have come from a marketing background, it may be difficult to decide which elements are the most important to include in an initial technical SEO audit.  When working with clients it may not be possible to fix all of the errors on a client’s site, so for the non-technical SEO these are the key areas that should be included in the audit:

1) Site Errors

One of the first areas to look at when conducting a technical audit is Google Webmaster Tools, specifically the error reports.  Each site should be verified in Google Webmaster Tools and this is where many of the site errors can be easily identified.

GWT only highlights errors found by Google, so it is important to check the webmaster tools for Yahoo and Bing as well to capture all possible errors. Things to be aware of are crawl errors such as 404 pages or DNS lookup errors.

For those who want a more comprehensive list of the site errors, using a tool like Screaming Frog to crawl the site and identify possible issues is extremely helpful.

2) Robots.txt file

The robots.txt file is used to prevent the search engine spiders from crawling certain pages.  When conducting the technical SEO audit, it is important to check that the robots.txt file has been used correctly.  Some robots.txt files may be restricting crawlers from accessing the entire site, as the example below illustrates:

User-agent: *
Disallow: /
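
By contrast, a robots.txt file that is only meant to keep crawlers out of one section of the site (the /checkout/ folder here is purely an illustrative example) would look something like this:

User-agent: *
Disallow: /checkout/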

Google Webmaster tools will also highlight URLs that are being blocked.

If the client does not want a certain area of the website to be crawled or indexed, they can also indicate this with a robots meta tag in the <head> of the page’s HTML code:

<head>
<meta name="robots" content="noindex, nofollow"/>
</head>

3) Indexing

For those sites that have uploaded a sitemap to Google Webmaster Tools (see below), it is useful to check the number of URLs crawled.  It is also possible to verify the number of indexed pages by carrying out a site search query:

site:example.com

The number of pages returned (even though it can be quite an exaggerated number) should be around the same as the number of URLs crawled.  If there is a big difference, further investigation is needed.

Alternatively, there is a report in GWT called ‘Index Status’ which gives additional insight into how many pages of the site Google has included in its index.

4) URLs

URLs should be short, with no more than 115 characters, and should be static – i.e. one static URL for each page on the site in question. Some websites have URLs with excessive parameters, which is very common on e-commerce sites. For example:

www.site.com/products/object?cat=52&type=3&order=a

These parameters make it hard for the search engine spiders to crawl and index the content, and can also create duplicate content issues. They do not read well for the user either.

Some sites use session IDs, which cause the same problem as parameters: each user visiting the website is assigned a different session ID, which is then included in the URL.
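
A hypothetical example of a URL carrying a session ID might look like this:

www.site.com/products/object?sessionid=12ab34cd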

URLs should be descriptive and human-readable, and optimised to contain relevant keywords. If the page sits many directory levels down from the root directory, the search engines will interpret the page as less valuable.  Many sites have deep-level pages, for example:

www.site.com/country/city/categories/products instead of

www.site.com/categories/products

It may not be possible to change the structure of an existing website, but it is important to implement a flatter structure if a new website is being launched.

5) Identical Title Tags and Meta Descriptions

This may fall under a content audit, but looking at Google Webmaster Tools and running a crawling tool such as Screaming Frog will easily identify duplicate meta titles and meta descriptions.  It is important to raise the duplicate meta data in the technical SEO audit; it can then be addressed in more depth in the content audit.

All title tags should be unique, contain keywords (yes, this is still important) and be no longer than 70 characters.  If a title is longer than this, it will be cut off in search engine results.  If dealing with a large site, it is important to prioritise and optimise the key landing pages by hand; do not use a formula.  Meta descriptions are just as important, and time should be spent writing descriptions that encourage users to click through to the page.  The description should be no more than 155 characters.
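
As an illustration (the wording below is purely an example, not a template), a unique title tag and meta description sit in the <head> of the page like this:

<head>
<title>Nokia Mobile Phones | Site Name</title>
<meta name="description" content="Browse our range of Nokia mobile phones, compare models and prices, and order online."/>
</head>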

6) Canonical Tags

If the same content on a site is found on more than one URL – i.e. the same content is repeated on multiple pages with different URLs – a canonical tag should be used to indicate to search engines the preferred URL. This is called the canonical URL, i.e. the preferred page that you want the search engines to rank and credit.

For example, if the page that needs to rank is www.site.com/mobile-phones/nokia.html, but the same content exists at different URLs (for example www.site.com/mobile-phones/brands/nokia.html and www.site.com/smartphones/nokia.html), the rel=canonical tag needs to be added to the <head> section of these duplicate pages.
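
On each of the duplicate pages, the tag would look something like this:

<link rel="canonical" href="http://www.site.com/mobile-phones/nokia.html"/>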

In the technical audit, the canonical URLs and the proposed solutions may need explaining to the client, as they can be confusing.  The canonical URL is the one that should be included in the sitemap.

Google have said they cannot guarantee to follow the preferred canonical URL, therefore it is best to make sure there is no duplicate content on the site (if possible).

7) Sitemap

An XML sitemap is a list of all the pages of a site that need to be indexed by the search engines.  The XML sitemap, which helps the search engines find all of the site’s pages, should be submitted in the Google Webmaster Tools account and its location should be referenced in the site’s robots.txt file. All pages on the site that require indexing should be in the sitemap.  It is important that the sitemap adheres to the sitemap protocol.
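
A minimal sitemap following the protocol looks something like this (the URL is simply the example used earlier in this post):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>http://www.site.com/categories/products</loc>
</url>
</urlset>

Its location can then be referenced in the robots.txt file with a single line (assuming the sitemap is saved as sitemap.xml in the root of the site):

Sitemap: http://www.site.com/sitemap.xml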

As Google indicates in Webmaster Tools how many of the pages listed in the XML sitemap it has included in its index, XML sitemaps are a very important tool for monitoring the indexation levels of a site.

8) Redirects

Check the redirects for a client’s site, as they may be using 302 temporary redirects instead of 301 permanent redirects. A 302 temporary redirect does not pass the “link juice” – the page’s search engine value and authority – to the new page as it tells the search engine it is a temporary redirect. A Screaming Frog crawl will highlight which redirects are used on the site.
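
As an example, on an Apache server (the file names here are purely illustrative), a 301 redirect can be added to the .htaccess file with a line like this:

Redirect 301 /old-page.html http://www.site.com/new-page.html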

I have been using a redirect checker from Ayima, which is really useful for checking a few individual pages.  If the site has a lot of redirects, it is important to address the issues as soon as possible.  The redirects should not simply be removed, however, as there may be many links pointing to the old pages; removing them would create a new problem: many 404 pages.

9) Cached Sites and View like a Spider

It is also important to view the cached version of the page to check when the site was last cached.  If it has not been cached for a while, it may mean there are crawling issues with the site or that the site has suffered a penalty.

If the query ‘cache:www.example.com’ is entered into a search engine, it is possible to see the date when the page was last cached.

On a related note, it is important to check what the site looks like to a search engine spider.  One way to do this is to use SEO Browser.  For sites that may look good visually (for example, sites built in Flash), this tool quickly demonstrates how the site appears to a search engine.  Flash is difficult for search engines to crawl and index, which is why such a site needs to be seen from an SEO perspective.

10) Page Speed

The performance of the site is the final test.  Users are impatient, as are search engine crawlers.  If the site takes too long to load, then search engine crawlers and users will leave.  If the site loads quickly, it will be crawled more thoroughly, and the pages will therefore have a better chance of being indexed and ranked in the SERPs.  There are many tools to check the speed of a site, and good old Google have their own Page Speed checker. There is also a Site Speed report in Google Analytics.

There are many ways to carry out a technical audit. This is by no means an exhaustive list, but it should help those from a non-techie background in particular to identify some of the most important elements to include in an audit.  If there are any more tips that can be shared which have not been included in this post, please leave a comment below.


Written By
Jo Juliana Turnbull is the organiser of Search London and the founder of SEO Jo Blogs, which provides practical advice and tips for those in SEO.