
5 Reasons You Lose Traffic After a Website Migration & How You Can Prevent It

19th July 2016

There comes a time in most websites’ lifetimes when they will go through the dreaded website migration.

Website migrations are one of the most difficult technical processes to go through, regardless of how skilled you are in digital marketing or development. You are taking a website that (hopefully) has stability, and you are making a huge change. This can cause a host of issues and impact the business objectives regardless of how thorough your planning and implementation have been.

In this post I am going to talk through five of the most common reasons that you may lose traffic to your website during your migration, and the steps you can take to try and prevent it.

1. Lack of respect for redirects

301 redirects are standard practice when it comes to website migrations. You find all of the pages on the current website, and you redirect them to their new locations with a single hop (okay, there is a bit more to it than just that). Sounds simple, right?

Well…

What happens when your development team decides to skip that part, ignores your recommendations and hard work, and just puts up the new website?

You get Google (other search engines are available) spitting its dummy out about the number of pages returning 404s, and your SEO having a heart attack as the 404 count rises by the thousands every day!

[Image: crawl errors rising in Google Search Console]

Inevitably, if this issue is not turned around quickly, you start to lose visibility within the search engine. This leads to a decrease in organic traffic and the potential loss of conversions. I don’t need to tell you that this is not a good position to be in.

So how do you ensure that this does not happen?

Firstly, you need to have or build a good relationship with the development team working on the project. Go and buy them coffee, help them out, make friends. This will stand you in good stead, not only for the migration but for other technical changes you require.

Secondly, you need to ensure that you have conducted a thorough crawl of the website using all the tools available to you. I tend to use a combination of sources – a dedicated site crawler, analytics data, Google Search Console and server log files – to capture as many URLs as possible.

These URLs then need to be mapped correctly to the new location using a single 301 redirect. I would suggest that you use rules where possible to reduce the number of individual redirect calls being made.
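As a minimal sketch of what a rule-based redirect can look like – assuming an Apache server, a hypothetical old /blog/ section and a hypothetical new domain – a single pattern can stand in for hundreds of individual entries:

```
# Hypothetical rule-based redirect (.htaccess, Apache mod_alias):
# send the entire /blog/ section to /news/ on the new domain in one 301 hop.
RedirectMatch 301 ^/blog/(.*)$ https://www.newdomain.com/news/$1
```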

Thirdly – and here is the important part – test that these redirects work on the staging environment. That way you can check that they have been implemented correctly and that they are behaving how you would expect. Once you are happy with these, double-check them on the launch of the new website to ensure they have been moved across, and continue to monitor them over the following months.
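If it helps, here is a rough sketch of that staging check in Python, using the requests library. The redirect map and staging hostname are hypothetical placeholders:

```python
import requests

# Hypothetical map of old URLs to their expected new locations.
REDIRECT_MAP = {
    "https://staging.example.com/old-page/": "https://staging.example.com/new-page/",
    "https://staging.example.com/blog/post/": "https://staging.example.com/news/post/",
}

for old_url, expected in REDIRECT_MAP.items():
    # Don't follow the redirect: we want to inspect the first hop only.
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    if status == 301 and location == expected:
        print(f"OK    {old_url} -> {location}")
    else:
        print(f"FAIL  {old_url}: got {status} -> {location or 'no Location header'}")
```

Anything flagged as a FAIL – a 302 instead of a 301, a chain of hops, or the wrong destination – should go back to the development team before launch.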

2. Google taking time to recognise redirects

Recent experience suggests that Google is taking longer than it used to to recognise redirects and other changes made during a site migration, meaning those changes are slow to be reflected in the index.

The chart below shows how Google indexed the new and old versions of a website over a two-month period. Although I would expect to see fluctuation over time, previous migrations have seen a much quicker change, with Google reflecting the new URLs in the index far sooner.

[Image: indexation of the new vs old website over two months]

There are a number of reasons why your new website may have fewer pages indexed than your previous one, but it is essential that you figure out which applies to you.

At this stage, most people will simply refer to visibility tools, such as the one shown below, as a measure of progress. Although it is good to see how you compare to the previous state of affairs, you need to keep an eye on your internal data too.

[Image: search visibility graph]

Tip: Don’t take the visibility graph at face value; dig in to see if you have retained similar rankings. It is great to have a similar or better-looking graph, but absolutely pointless if all the terms have dropped to page 2 or beyond.
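As a rough sketch of what that digging in might look like, assuming you have keyword ranking exports from before and after the migration as CSV files (the filenames and column names here are hypothetical):

```python
import csv

def load_rankings(path):
    # Expects a CSV with "keyword" and "position" columns (hypothetical export format).
    with open(path, newline="") as f:
        return {row["keyword"]: int(row["position"]) for row in csv.DictReader(f)}

before = load_rankings("rankings_before.csv")
after = load_rankings("rankings_after.csv")

# Flag terms that were on page 1 before the migration but have slipped beyond it.
for keyword, old_pos in sorted(before.items()):
    new_pos = after.get(keyword)
    if old_pos <= 10 and (new_pos is None or new_pos > 10):
        print(f"{keyword}: {old_pos} -> {new_pos if new_pos else 'not ranking'}")
```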

So how do you help speed up the indexing process?

This is one of those times where you are in Google’s hands, waiting for them to recrawl the website and reflect that in their index. I would, however, suggest that you do the following to help as much as possible:

  • Use the Change of Address tool in GSC (if applicable).
  • Upload new XML sitemaps to GSC – I would also upload the new XML sitemap to the old GSC account (see the example after this list).
  • Regularly review the new XML sitemaps and any pages/sections within GSC that are not being indexed. Identify the key areas and use the Fetch as Google feature to submit them to Google.
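For reference, the new XML sitemap you upload needs nothing more than the new URLs in the standard sitemap protocol format (the domain and date below are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.newdomain.com/news/post/</loc>
    <lastmod>2016-07-19</lastmod>
  </url>
</urlset>
```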

3. Removal of pages

It is common during a website migration for the information architecture of the website to change. Moving to a new website/domain provides the perfect opportunity to improve the way users and search engines can get around.

It is at this stage, and before any pages have been removed, that you need to understand the impact those changes will have on the business objectives.

Take a look at this somewhat fictitious exchange:

Client/Stakeholder: “I am going to remove X pages during the migration as they are not converting.”

You: “By doing so you will lose X% of traffic across all channels with the likelihood of losing organic visibility, which in turn will affect conversion.”

Client/Stakeholder: “That’s fine, as they are not converting directly and therefore the traffic is not qualified.”

You: “But this will also have an impact on your assisted conversions, I would suggest that we combine these pages where possible.”

Client/Stakeholder: “I understand, but I am going ahead.”

Website launches:

[Image: traffic drop following the removal of pages]

Client/Stakeholder: “We have lost lots of traffic and the board are going nuts!”

You: “Face palm! – How are the conversions?”

Client/Stakeholder: “Down! WTF!”

So how do you reduce the potential of this happening?

Do research! And do it thoroughly. If you and/or the client want to remove pages, then you need to really understand the impact that it will have. Information that you want to be able to present back to the client/key stakeholders includes:

  • Impact on key metrics such as conversion / traffic.
  • Potential impact on search engine visibility. Losing pages will mean the potential loss of keyword rankings.
  • Alternative solutions if relevant. Can you combine some of the pages to make them more relevant? Can the pages be improved to lift conversion? (A quick way to quantify the traffic at stake is sketched after this list.)
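On the first point, here is a rough sketch of how you might quantify the traffic at stake, assuming a landing-page CSV export from your analytics. The filename, column names and page paths are all hypothetical:

```python
import csv

# Hypothetical list of pages the client wants to remove.
PAGES_TO_REMOVE = {"/guides/widget-sizing/", "/guides/widget-care/"}

total_sessions = 0
removed_sessions = 0
with open("landing_pages.csv", newline="") as f:
    # Expects "landing_page" and "sessions" columns in the export.
    for row in csv.DictReader(f):
        sessions = int(row["sessions"])
        total_sessions += sessions
        if row["landing_page"] in PAGES_TO_REMOVE:
            removed_sessions += sessions

share = 100 * removed_sessions / total_sessions if total_sessions else 0
print(f"Pages earmarked for removal drive {removed_sessions} sessions "
      f"({share:.1f}% of total).")
```

Numbers like these make the conversation with the stakeholder much harder to wave away.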

4. Crawlers being blocked through Robots.txt & NoIndex tags

As standard practice, you should ensure that any new website is not visible to users or search engines whilst it is going through the development stages. As you can see below, this is not always the case.

[Image: staging websites appearing in Google's index]

You could conduct a number of searches in Google right now and find an array of websites with their development or staging environments indexed. Go and take a look by trying the following:

  • site:[INSERT DOMAIN] inurl:staging.
  • site:[INSERT DOMAIN] inurl:dev.
  • site:[INSERT DOMAIN] inurl:uat.

How did you get on? Find many?

More importantly, how does this mean that you lose traffic? Well, if standard practice has been followed, you should not see any of the above, as your development team would have added both a Disallow: / rule to the robots.txt file and a meta NoIndex tag to every single page BEFORE a search engine could crawl it.
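For clarity, those two safeguards look like this on the staging site (and both must be removed at launch):

```
# Staging robots.txt: block all crawlers from the whole site
User-agent: *
Disallow: /
```

and, on every page:

```html
<meta name="robots" content="noindex">
```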

Some people might say that this is overkill, but I would want to ensure that nobody outside the confines of the business and any external partners knows what is coming. I would even suggest that the website is placed behind a login and IP-restricted to those trusted few.

Anyhow, I digress. The issue of traffic loss arises when you move the website from development to a live environment. It is at this stage that small details are often missed, notably the removal of the NoIndex tags and the Disallow: / command in the robots.txt.

If these tags are not removed from the website on launch, then you are going to be in a bit of trouble. Your meta descriptions in the search results will start to indicate that the pages are being blocked by the robots.txt file, and after a while (if not resolved) your pages will start to drop from the index.

So how do you stop this from happening?

This one is easy, or at least I would hope so. On launch of the website, check the robots.txt file for the Disallow: / command blocking all robots. I would also recommend that you run a crawl of the website and pay special attention to any lingering NoIndex tags.
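A minimal sketch of that launch-day check in Python, again using the requests library; the domain and page list are hypothetical placeholders:

```python
import requests

SITE = "https://www.example.com"          # hypothetical live domain
KEY_PAGES = ["/", "/news/", "/contact/"]  # hypothetical sample of key pages

# 1. The live robots.txt should no longer block the whole site.
robots = requests.get(f"{SITE}/robots.txt", timeout=10).text
if any(line.strip() == "Disallow: /" for line in robots.splitlines()):
    print("WARNING: robots.txt still contains a blanket Disallow: /")

# 2. No page should still carry a NoIndex directive.
for path in KEY_PAGES:
    html = requests.get(f"{SITE}{path}", timeout=10).text.lower()
    if 'name="robots"' in html and "noindex" in html:
        print(f"WARNING: {path} still appears to have a NoIndex tag")
```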

5. Losing ALL traffic

One basic mistake that can be made is not moving across or adding in your analytics. I recently came across a website that had gone through a migration and lost ALL of their traffic.

[Image: analytics graph showing a complete loss of traffic]

As you can imagine, they were in despair. When I pointed out that there was no tracking code anywhere on the website, they were very annoyed, but also relieved that they had not actually lost everything.

But why does this happen? Surely you would expect tracking to be added as a matter of course.

Well, in my experience that has not always been the case. Depending on the migration type, and whether you are having a new website built, you need to specifically request that the tracking is moved across.

So how do you prevent this from happening?

I would suggest that you use Google Tag Manager and have this implemented on the website throughout the development process.

From here you can approach things in one of two ways, depending on how comfortable you are with GA and GTM.

The first option, and probably the simplest, is to ensure your GA code has been implemented within Google Tag Manager but hasn’t been published. Then on launch, all you need to do is publish the tag to ensure you are tracking continuously.

The second option, and the one I would generally plump for, is a little more involved. I am keen that all my tracking is in place before the website is launched, and therefore I would want to test events, goals, eCommerce (if applicable) and so on, but I don’t want that skewing any live data. Therefore, I would do the following:

  1. Create a new GA account specifically for the staging environment, or use an existing view and filters.
  2. Publish the tag containing the test profile and begin testing.
  3. Once happy, and on launch, remove the test tag and implement the tag with the live account details (a quick verification sketch follows after this list).
  4. Create an annotation in GA to highlight the change of website.
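As a final launch-day sanity check, here is a rough sketch in Python that confirms the live GTM container ID actually appears in the source of your key pages. The domain, container ID and page list are all hypothetical placeholders:

```python
import requests

SITE = "https://www.example.com"          # hypothetical live domain
CONTAINER_ID = "GTM-ABC1234"              # hypothetical live GTM container ID
KEY_PAGES = ["/", "/news/", "/contact/"]  # hypothetical sample of key pages

for path in KEY_PAGES:
    html = requests.get(f"{SITE}{path}", timeout=10).text
    status = "present" if CONTAINER_ID in html else "MISSING"
    print(f"{CONTAINER_ID} {status} on {path}")
```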

But that’s just me. 🙂

There you have it: 5 reasons you could lose traffic during your site migration and how to prevent them from happening. You may think that these are very basic issues, and I would agree. However, these mistakes are made time and time again because they are small details that people forget during such a large and data-intensive process.

I would love to hear about your migration in the comments below, and whether you came across any of the issues I mentioned.

[Image credit – chintermeyer]


Written By
Daniel Bianchini is a freelance SEO & digital marketing specialist based in Oxford, working with a range of agencies, in-house teams and SMBs.