Technical Audits – 10 Points to Cover

18th July 2019

I have been working in SEO for over 10 years, client side, agency side and freelance. An audit of a website is one of the best places to start with a new client. It allows you to see the code, what should be working and what is not, and how you as an online marketer can help the client and add value.

There are many tools to choose from in the market to crawl your site such as:

  • Screaming Frog
  • DeepCrawl
  • OnCrawl
  • Searchmetrics

The tool you use will depend on your budget, in terms of both money and time, and will help you identify the main issues with a website. These are some of the key points to cover in your next audit:

1. Internal and External Errors

With any new client or project I am working on, I always like to perform a crawl, which gives a comprehensive overview of the issues. It is important to check the status of the URLs and ensure there are no 404 errors; if there are, they need to be resolved. As part of reviewing internal and external errors, I look out for any 302s and check whether 301s are in place on the top-priority pages.

If a site has many 404 errors, some recommend redirecting them all to the home page, but it is more important to analyse those pages and fix each 404 properly. I like Yoast’s analogy: when parents ask a child to clean up a messy room, the child puts everything in the top drawer so that the room looks tidy. However, everything is in the wrong location, so it really has to be sorted out later.

I would stress that you should also analyse the external errors. You want to help your users, not send them to a broken external page; broken external links make your site look less authoritative.
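
If you want a quick sanity check alongside your crawler, here is a minimal Python sketch. It assumes you have exported your URLs to a plain text file (the urls.txt name is just for illustration) and it flags 301s, 302s and 404s so you can prioritise the fixes.

# Minimal sketch: check status codes for a list of URLs exported from a crawl.
# "urls.txt" (one URL per line) is an assumed file name for illustration.
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # HEAD keeps the check lightweight; allow_redirects=False exposes 301s and 302s.
        response = requests.head(url, allow_redirects=False, timeout=10)
        if response.status_code in (301, 302, 404):
            print(f"{response.status_code}  {url}  -> {response.headers.get('Location', '')}")
    except requests.RequestException as exc:
        print(f"ERROR  {url}  ({exc})")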

 

2. Alt tags

Another area that SEO consultants should be able to change is the alt text. “Alt” stands for alternative text, and its main purpose is to help visually impaired and blind users who use screen readers. Alt text describes the image and how it relates to the content on the page. It is still important for SEO because search crawlers cannot see images, so describe the image as specifically as possible, keep it short and use your keywords without keyword stuffing. Include the alt text along with the image title and description. Syed wrote a great article on Search Engine Watch about alt tags.
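
As a rough illustration, the sketch below flags images with missing or empty alt text on a single page. The URL is just an example, and it assumes the requests and beautifulsoup4 libraries are installed.

# Minimal sketch: flag <img> tags with missing or empty alt text on one page.
# The URL is illustrative; requests and beautifulsoup4 are assumed to be installed.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print(f"Missing alt text: {img.get('src')}")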

3. Robots.txt

The robots.txt file lives at the root of a domain (e.g. www.example.com/robots.txt) and tells robots and spiders which URLs they should not crawl on your site. This helps save crawl budget. Rachel Costello wrote an in-depth post about robots.txt files in DeepCrawl’s knowledge centre. The robots.txt file is a very sensitive area, and a single character in the wrong place could potentially cause a lot of damage.

For example, this means all bots can access everything:

User-agent: *
Disallow:

However, if you add a trailing slash to the Disallow line, it means all robots are blocked from the entire site:

User-agent: *
Disallow: /

Therefore it is very important to check the syntax in the robots.txt file and ensure the pages that should be crawled can actually be crawled.
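
One simple way to check this is with Python’s built-in urllib.robotparser, as in the sketch below. The domain and the list of priority pages are illustrative; swap in your own.

# Minimal sketch: check whether key pages are crawlable under the live robots.txt,
# using Python's built-in urllib.robotparser. URLs are illustrative.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

priority_pages = [
    "https://www.example.com/",
    "https://www.example.com/products/",
]

for page in priority_pages:
    # can_fetch() returns False if the rules block the given user agent.
    allowed = parser.can_fetch("Googlebot", page)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {page}")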

4. Sitemap

The sitemap, which should be referenced in the robots.txt file, helps the spider by listing the pages that need to be crawled. Analyse the URLs in the sitemap to ensure they are the URLs you actually want indexed. Sometimes URLs that return a 404, or URLs that have been blocked in the robots.txt file, end up in the sitemap; this wastes crawl budget and should be resolved. The sitemap needs to be up to date and free of errors.

The sitemap should be uploaded to Google Search Console.
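
To spot broken URLs in a sitemap, a short Python sketch like the one below can help. It assumes a standard XML sitemap at the illustrative URL shown and that the requests library is installed.

# Minimal sketch: pull URLs out of an XML sitemap and flag any that do not return 200.
# The sitemap URL is illustrative; requests is assumed to be installed.
import xml.etree.ElementTree as ET
import requests

sitemap_url = "https://www.example.com/sitemap.xml"
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", ns) if loc.text]

for url in urls:
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{status}  {url}")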

5. Page Speed

Page speed is one of the most important ranking factors. Today there are many ways to check the speed of your site and its performance on mobile and desktop; I like to use Google PageSpeed Insights and GTmetrix. Mobile accounts for around half of internet traffic, and now with mobile-first indexing your site will be negatively impacted by slow loading times on a mobile device. The site should load in under two seconds. GTmetrix clearly shows what needs to be changed, as you can see below in a top-level view.

GTmetrix report (screenshot)

Google said that the “Speed Update” would only affect pages that deliver the slowest experience to users and would only affect a small percentage of queries. However, Google often says that updates affect a small percentage of queries, and thousands of sites have still been affected.
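
If you want to pull scores programmatically rather than through the web tool, the sketch below queries the PageSpeed Insights v5 API for a mobile performance score. The target URL is illustrative, and it assumes the v5 endpoint and response fields are as currently documented; an API key becomes necessary for heavier use.

# Minimal sketch: query the PageSpeed Insights v5 API for a mobile performance score.
# The target URL is illustrative.
import requests

endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(endpoint, params=params, timeout=60).json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")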

6. Check the HTTPS

In 2014, Google started using HTTPS as a ranking signal, so it is imperative that a site serves all of its pages over HTTPS. This means all pages need to load over a secure connection, which is not always the case. Canonicals, links and redirects should all point to HTTPS URLs. Do a search of your website to check that only one version of your site is browsable. For example, it should just be:

https://www.example.com or https://example.com

After you have crawled your website, you will be able to see all the pages that are indexable, including whether they are on HTTP or HTTPS.
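
A quick way to confirm that only one version is browsable is to check how the other variants redirect. The sketch below is a minimal example; the domain and the choice of https://www.example.com/ as the canonical version are assumptions for illustration.

# Minimal sketch: check that non-canonical protocol/host variants redirect (301)
# to one canonical HTTPS version. The domain and canonical choice are illustrative.
import requests

canonical = "https://www.example.com/"
variants = [
    "http://www.example.com/",
    "http://example.com/",
    "https://example.com/",
]

for url in variants:
    response = requests.head(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    # Each variant should return a 301 pointing at the canonical HTTPS URL.
    print(f"{url} -> {response.status_code} {location}")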

7. Do a site search

Find out how many pages of your site have been indexed in Google.

Open a clean browser and type in:

site:https://www.example.com

The number of pages appearing in the search results should be roughly the same as the number reported in Google Search Console. Review the URLs in Google Search Console on a regular basis to ensure all pages you want indexed are present. If, when you conduct a site search, you see no pages or far fewer than Google Search Console reports, then something is wrong, as those pages have not been indexed.

8. Duplicate content

The results from any crawl will also highlight if there are duplicate content issues. Screaming Frog allows you to filter by meta data and see the pages that have duplicate meta titles and meta descriptions. You can also use Copyscape to see if your page has duplicate content issues.
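
If you prefer to work from a crawl export, a few lines of Python can surface duplicate titles. In the sketch below, the crawl.csv file name and the "Address"/"Title 1" column names are assumptions about your crawler's export format; adjust them to match your tool.

# Minimal sketch: group crawled pages by page title to surface duplicates.
# "crawl.csv" and the "Address"/"Title 1" column names are assumptions.
import csv
from collections import defaultdict

pages_by_title = defaultdict(list)

with open("crawl.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        title = (row.get("Title 1") or "").strip()
        if title:
            pages_by_title[title].append(row["Address"])

for title, pages in pages_by_title.items():
    if len(pages) > 1:
        print(f"Duplicate title ({len(pages)} pages): {title}")
        for page in pages:
            print(f"  {page}")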

9. Mobile Friendly

Is your site mobile friendly? There has been talk for years that “this is the year of mobile”, but now with mobile-first indexing it really is. Use Google’s Mobile-Friendly Test to check your site.


10. Meta data

It is still important to optimise meta data, both the title tag and the meta description. The meta description is not directly a ranking factor anymore, but a clear call to action in it will encourage customers to click through to your site. This results in a higher click-through rate, which signals relevance to Google and can help your site perform better than competitors with a lower CTR. Meta data is one of the areas that anyone should be able to change on a site: it is edited in the CMS, can be amended by any SEO consultant, and results can be seen in as little as a few days (as soon as your site has been crawled).
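
As a quick spot check on a single page, the sketch below pulls out the title tag and meta description and flags common problems. The URL and the length thresholds (60 and 160 characters) are illustrative assumptions, since Google truncates by pixel width rather than a fixed character count.

# Minimal sketch: pull the title tag and meta description from a page and flag
# common problems. The URL and length thresholds are illustrative assumptions.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
meta = soup.find("meta", attrs={"name": "description"})
description = (meta.get("content") or "").strip() if meta else ""

if not title:
    print("Missing title tag")
elif len(title) > 60:
    print(f"Title may be truncated in the SERP ({len(title)} characters)")

if not description:
    print("Missing meta description")
elif len(description) > 160:
    print(f"Meta description may be truncated ({len(description)} characters)")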

 

This post is not an exhaustive list, but it covers some of the main points to consider when carrying out a technical audit. If you have any comments or questions, let me know.

Photo by Carlos Muza on Unsplash


Written By
Jo Juliana Turnbull is the organiser of Search London and the founder of SEO Jo Blogs, which provides practical advice and tips for those in SEO.