4 Technical SEO Issues That Often Go Unnoticed

Focusing on great content for your website but failing on technical SEO is like putting Fernando Alonso in the 2015 McLaren F1 car: you have a great asset, but it is being held back by technical issues!

In this post, I discuss four technical SEO issues that go unnoticed by most companies.

Redirect Chains

Redirects are part and parcel of having an evolving website. You want to ensure that neither search engines nor users have a bad experience, so you add redirects to the most relevant page, and quite right too.

But what occurs more often than some people realise is that the page you are redirecting to has already been redirected itself, creating a redirect chain. This is common in both eCommerce and editorial content, but it can be solved relatively easily.

The problem is that you are potentially losing link authority gained by pages you redirected two or three iterations ago. I appreciate that Matt Cutts has said all link value is passed through redirects, but I am a big believer that the more redirects a URL passes through, the more value is lost.

To see if you have any redirect chains on your website, fire up Screaming Frog and run a crawl. Once the crawl completes, go to the menu and select Reports > Redirect Chains.

This will provide you with an XLS of all the redirects and redirect chains currently live on the website. The next step is to start cleaning these up; I have seen some good gains in traffic from collapsing a redirect chain into a one-to-one redirect.
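
If you want a quick spot-check on a single URL before running a full crawl, you can follow the hops from the command line. A minimal sketch, assuming a hypothetical URL:

```
curl -sIL http://www.domain.com/old-page | grep -iE "^(HTTP|Location)"
```

Fixing a chain usually means pointing every retired URL straight at the final destination. A minimal sketch of the before and after, assuming an Apache server and hypothetical paths:

```
# Before: /old-page -> /newer-page -> /current-page (two hops)
Redirect 301 /old-page /newer-page
Redirect 301 /newer-page /current-page

# After: every retired URL points directly at the final destination
Redirect 301 /old-page /current-page
Redirect 301 /newer-page /current-page
```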

Layered Navigation

I come across this issue ALL of the time, yet nobody seems to be solving it. It is not that difficult to plan for when you are creating an eCommerce website, or to change once one has been built, but people are still not dealing with layered navigation.

For those who are not sure what I mean by layered navigation, I am talking about the filtering system you see on most, if not all, eCommerce product listings. It is the navigation that allows you to filter down by brand, size, colour, reviews, etc.

This, alongside product pages, is one of the most common causes of duplicate content on eCommerce websites. If you run an eCommerce store, nine times out of ten a site: search in Google will show far more pages indexed than you would expect, and that is likely to be down to issues with layered navigation.
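
For example, a quick way to gauge the scale of the problem is to compare a plain site: search with one targeting a filter parameter (hypothetical domain and parameter name):

```
site:www.domain.com                  # everything Google has indexed for the domain
site:www.domain.com inurl:colour     # indexed URLs generated by the colour filter
```

If the second query returns pages, your layered navigation is leaking into the index.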

Providing users with the flexibility to be granular with their filtering is great from a user perspective, and one I fully support. However, these filtered pages need to be handled correctly.

Here are three examples of issues you will find with layered navigation and how they could be solved.

Product listing pages:

If you provide the user with the functionality to change the number of products shown in a listing, then you need to ensure that only a single URL is indexed.

The most common way of handling this is by adding the rel=canonical tag. The only question you need to ask yourself is which page you want indexed. On most eCommerce solutions you have the following options:

  • 12 (default view)
  • 24
  • 48
  • View All

Depending on the speed of your website, I would rel=canonical to either the default view or the View All page, but I would definitely pick one. If you do not include a rel=canonical tag, all of these pages will be indexed for every single filter variation you can imagine on your website. That is a lot of extra pages!
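
As a minimal sketch, assuming a hypothetical category URL that uses a ?show= parameter for the product count, every variant would carry the same canonical tag pointing at your chosen version:

```
<!-- On /category/?show=24, /category/?show=48 and /category/?show=all -->
<link rel="canonical" href="http://www.domain.com/category/" />
```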


Filter options:

You do not want or need all of your filter options to generate dynamic URLs. You would expect brand terms to be static URLs rather than dynamic ones. There are likely to be other filter options worth keeping static too; this depends on the website you are working on, but keyword research can help you decide.

However, when allowing users to filter by attributes such as colour, size, price and review, you are likely to want these to be dynamic URLs, with a rel=canonical tag added.

Example below (hypothetical URLs and filter values).

  • www.domain.com/product/brand/ – This is fine to be kept as it is, a static URL.
  • www.domain.com/product/brand/?colour=red – This should have the canonical tag shown below added to it.
  • www.domain.com/product/brand/?colour=red&size=10 – This should carry the same canonical tag.
  • www.domain.com/product/brand/?colour=red&size=10&review=5 – This should carry the same canonical tag.
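
The tag itself, assuming the static brand page is the version you want indexed, would sit in the <head> of each filtered variation:

```
<link rel="canonical" href="http://www.domain.com/product/brand/" />
```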

*Note: All eCommerce sites are different, and keyword research should be carried out to determine which types of page are delivered by static URLs and which by dynamic URLs.


Pagination:

This can be handled in two ways: either canonicalising all paginated pages to a single page, usually the View All, or using the rel=next/prev mark-up that is available.

Which option you take depends very much on the speed of your website and the number of products you have. Google prefers to surface the View All page, so if there are fewer than ten pages I like to rel=canonical to it. However, if there are consistently more than ten pages, I implement rel=next/prev tags to indicate to the search engines that the pages form a single paginated series.
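
A minimal sketch of the rel=next/prev mark-up, assuming a hypothetical category paginated with a ?page= parameter; this would sit in the <head> of page 2:

```
<!-- On http://www.domain.com/category/?page=2 -->
<link rel="prev" href="http://www.domain.com/category/?page=1" />
<link rel="next" href="http://www.domain.com/category/?page=3" />
```

The first page carries only a rel=next tag and the last page only a rel=prev, so the series has a clear start and end.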

You can find more on the Google Webmaster Central blog.


Robots.txt

When was the last time you honestly looked at your robots.txt? Have you ever looked at it? You are not alone; a lot of people have not. The robots.txt file provides the ideal way to restrict search engines from accessing content or elements they do not need to see.

It is important that the robots.txt file is understood and utilised as much as possible. Adding rogue folders or files to it can have a serious impact on the way your website is crawled.
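
A minimal sketch of a sensible file, with hypothetical paths; note how close a useful rule sits to a catastrophic one:

```
User-agent: *
# Keep crawlers out of areas with no search value (hypothetical paths)
Disallow: /checkout/
Disallow: /search/

# A rogue rule like the one below would block the entire site - avoid!
# Disallow: /
```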

If you are looking for more information on how to use the robots.txt file, Google has provided a resource for you – https://support.google.com/webmasters/answer/6062608?rd=1

Schema Mark-Up

I attended a conference recently where the presenter asked how many of us were using schema mark-up; only four people raised their hands. Four people out of a room of nearly 200 – I was astonished.

For eCommerce it is essential, and I cannot recommend it enough to my clients – not just because we have entered the world of structured data and need to give the search engines context about what we are trying to say, but because, at present, it still differentiates your website in the SERPs.
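
As a minimal sketch, this is what Product mark-up might look like in JSON-LD form, using an entirely hypothetical product; the same information can also be embedded as microdata attributes in your existing HTML:

```
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "A hypothetical product used for illustration.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "GBP",
    "availability": "http://schema.org/InStock"
  }
}
</script>
```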

There is a range of schema mark-up available beyond products, so you do not have the excuse of saying ‘I don’t work on an eCommerce store’. To find out more, take a look at http://www.schema.org/, and if you are looking for help creating your schema, here is another handy tool – http://schema-creator.org/.

If you only take a couple of things away from this post, I would strongly recommend solving your layered navigation issues and implementing schema where possible.

Do you often miss these four technical SEO issues? Are there others that you feel get missed when auditing a website from a technical perspective? I would love to hear your feedback in the comments below or on Twitter @danielbianchini.

[Image credit: The Guardian]



Daniel Bianchini is the Director of Services at White.net, a creative digital marketing agency based in Oxford, UK. Having been in digital marketing since leaving University, Daniel has worked in-house at Dixons Stores Group (Dixons Carphone), with many leading UK brands and helped start a digital marketing agency based in Hertfordshire.
  • Good post, Daniel.

    Wanted to pick your brain on the listing/filter section.

    My client uses a bespoke CMS which allows URLs with filters applied to be indexed – for example, http://www.brand.co.uk/product-a/filter1 would be a page, with the option for it to be crawled and indexed.

    It’s currently set to off, but I think there’s some opportunity here to grow search traffic. When you’re on the Product A main page you have the option to choose a colour.

    If you select grey, the URL changes to http://www.brand.co.uk/product-a/grey and the page title and headings all change too. This is a landing page for someone searching for “grey product a” or “product a in grey”, right?

    I think this is a good idea – but would you recommend using rel=canonical tags on these pages too, even though the content, headings, images and titles change anyway?

    Cheers Dan!

    • Daniel Bianchini

      Thanks Pritesh, appreciate it.

      If I am understanding you correctly, you want to know whether to use the rel=canonical tag on pages such as http://www.brand.com/product-a/grey.

      If this is correct, then for me it would depend on the following: is there search volume for those products in grey, and are there multiple products? If yes to both questions, I wouldn’t use a rel=canonical tag, as I would want this page to be indexed. If you say no to either question, then I would likely add a rel=canonical tag to the page to keep it more focussed.

      I hope that helps 🙂


  • Haresh Pansuriya

    Really great post, Daniel. But honestly, who has time to do this? Thanks for sharing.

    • Daniel Bianchini

      Hello Haresh,

      Thanks for the comment. I think that is the point: you should be making time to check these things. Websites are not just about the shiny content, but also about ensuring the technical features work well so people can see it.


    • Websites should be built with SEO in mind from day one, but often (mostly?) they are not. This can lead not just to a failure to reach organic search potential, but also to issues that can increasingly damage performance if they are not addressed.

      A lot of the time the problems lie in what I consider to be basic development best practice (such as canonical redirects), but often the first people to pick up the issues are technical SEO pros. Really, if every dev just used a simple SEO checklist, a lot of problems could be avoided.