How we got our rankings back with mainly technical changes

I would like to share a case that got me thinking about the importance of good coding. Some time ago we made some changes to the website of a supplier of stoves and fireplaces. The website has a good amount of authority within its industry, and although it ranked very well for the main product categories, individual products and product types didn’t come up in the search engines as often as we wanted.

The existing website was mainly focused on company information, with a separate section for the online product catalogue. We decided to create a structure with a major focus on the main product categories, a filter structure that would be optimized automatically for the product types, and a better structure to increase indexation of all of the products within the online catalogue.

The goal of the changes was to drive more organic traffic from searches for specific products and product types and, of course, to retain the current rankings for the main categories. Together with the client we invested some valuable hours coming up with a perfect structure and tackled problems like optimizing pages where combinations of filters were used.

Because the structure would change significantly, the URLs also had to change. We therefore took the opportunity to implement the cleanest URLs we could create. Of course we redirected all important old pages to their new locations to retain the rankings.

After a few weeks we completed the plan for the website and started building the new version. When it was completed we were confident it would mean a great improvement for the visibility in the SERPs for products and categories. And after the launch it quickly became clear that we were right: organic traffic focused on products and product types increased steadily.

It took some time for the redirects to get picked up by the search engines, but after a while most of the new pages were indexed. To our surprise, however, the rankings for the main categories didn’t get passed on to the respective new pages. They dropped 30-40 positions on average, away from the first pages and back into forgotten areas of the search engine index where daylight hardly travels.

At first we couldn’t find a clear cause for the drop. But after an extensive analysis we suspected something was wrong with the way the category pages were built in the back-end. To a visitor, the structure of a category page looked like this:

But turning off CSS, and thereby showing the page as search engines see it, revealed the following structure:
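The original diagram is not reproduced here. As a hypothetical sketch (the element names are assumed, not taken from the actual site), the source order came down to something like this:

```html
<!-- Hypothetical sketch of the problematic source order; all ids are assumed.
     CSS repositioned these blocks visually, but without CSS (and to a crawler)
     the page is read top to bottom in this order. -->
<body>
  <div id="news">
    <!-- news items with dates came first in the source -->
  </div>
  <div id="sidebar">
    <!-- secondary links and widgets -->
  </div>
  <div id="content">
    <!-- the actual category content, near the bottom of the source -->
    <div id="categories">
      <!-- links to the main category pages, last of all -->
    </div>
  </div>
</body>
```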

You can see the problem here. First of all, although all the necessary information is present on the category page, the information that is valuable for search engines sits at the bottom of the page.

Secondly, the most important pages on the website (the categories) are the last links presented on the page. Search engines could therefore view them as the least significant for users.

Thirdly, because of the high position of the news section in the structure, Google also presented the date of the top news item in the search result (this is also what triggered us to check the source code). Google might have perceived the page as a news page instead of a regular content page.

We decided to make a few changes to the code to present the information in the proper sequence. This resulted in the following structure:
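The original diagram is not reproduced here. As a hypothetical sketch (element names assumed, not taken from the actual site), the corrected source order would look roughly like this:

```html
<!-- Hypothetical sketch of the corrected source order; all ids are assumed.
     The important content and category links now come first in the source;
     the news section has moved to the bottom. -->
<body>
  <div id="content">
    <!-- main category content first -->
    <div id="categories">
      <!-- links to the main category pages -->
    </div>
  </div>
  <div id="sidebar">
    <!-- secondary links and widgets -->
  </div>
  <div id="news">
    <!-- news items with dates, now last in the source -->
  </div>
</body>
```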

This structure is clearly focused on the important content and important pages. So with eager anticipation we awaited the indexation of the new version of the page. A few hours later the new page was indexed, and the rankings immediately rose from pages 3-4 to the first SERP.

Now the old rankings are almost completely restored, and traffic is recovering as well. Combined with the changes for the products and filters, organic traffic is higher than ever before. And that without changing anything about the content or incoming links. Visitors can hardly notice the difference, but search engines apparently can.

So, although you might have a well-structured page for your visitors, how search engines perceive it is just as important. Take a look at your site the way search engines do by disabling CSS, checking the text version of the cached page, or using a tool like SEO Browser.

About Jeroen van Eck

Jeroen van Eck is a search engine marketing consultant at the online marketing company E-Focus in the Netherlands.

20 thoughts on “How we got our rankings back with mainly technical changes”

  1. Thanks for sharing this case study Jeroen. It’s indeed a great example of the value of well-structured HTML code and shows how search engines can (mis-)interpret a page’s relevance if the developer relies too heavily on CSS and not enough on the code structure as it is seen by crawlers.

  2. Thanks man! To be fair I was at first a little surprised by the impact it had. But if you look at it a little better, you can see why search engines would have a hard time reading the page correctly.

  3. Not surprised at all. I’ve long believed that the positioning of content using CSS can have a big impact on rankings. It’s great to have it documented so thoroughly and I’ll be using this article to add weight to my negotiations. Thanks!

  4. Minor changes like this can mean the difference between the first page and no page at all. It all helps, and we should be tweaking our sites to see what works and what is just a waste of time..

    “Black SEO Guy” Signing Off

  5. We have the same problem with a webshop right now. In May 2010 they built a new site and since then everything has been going down: search results and e-commerce. After reading this post we’re going to change the HTML.

    Thank you Jeroen

    Greetz Jeroen

  6. This is something that I have had first hand experience with and it is surprising how so many people can get this so wrong. It is really good to have some clear research which shows how this impacts. One day people will realise how important the minor seemingly insignificant things actually are. Great article!

  7. Small changes can really make a big difference when it comes to coding. Every detail matters in SEO, so it’s really important to test and do some research before you decide on something. By the way, impressive article!

  8. Nice practices, Jeroen,

    Looking at the website through crawlers’ eyes is an effective way to find out what the problem is.

    Besides tweaking the site structure in CSS files, making div ids, classes and other tags easier for crawlers to “comprehend” can also be powerful in informing crawlers which parts are important and which are not, like using “nav” for navigation, “content” for the important content on the page, etc.

  9. Holy Hell, I think I am going to need to read this like 15 times. I’m a content man. I can pump out some pretty killer stuff, but when it comes to google spiders, and trying to “see things through their eyes” my ADD kicks into overdrive.

    Don’t spiders have like 8 eyes anyways? Figuring out how to see from their point of view seems like it would give someone a headache.

    Anyways… thanks Jeroen, good stuff, I’ll try to read it again sometime when I’m not on 2 hours of sleep, and see if I can understand it.


  10. I always make sure to check the cached versions of my site; that way we can find out if there is anything wrong with the way the search engines see our website. About the URL changes and redirects: it was MSN Search where I had lost all my traffic, Yahoo restored a bit, and in time Google had almost returned the total traffic to the new pages.

  11. Thanks for sharing this case study.
    So if a site wants its news to be considered by Google News, is it better to put it at the top of the page or, as in your last diagram, at the bottom?

  12. Thanks all for the reactions! I guess this is still a big issue for many people. For everybody who made changes because of this article, please let us know the results!

    @Antonio: for ranking in Google News I suggest you read this article by Barry Adams: I think it is always important to position the information you want to rank for at the top of the page. For news this is the date, the topic and a short introduction.

  13. Great case study. If a page isn’t ranking as well as it should be, it’s always a good idea to go back in and look at the code and analyze it. The diagrams you provided here are helpful.

  14. The diagram clearly shows that it was the “news” section that was preventing Google from indexing the content. It’s amazing what a bit of tweaking the code can do to affect the rankings.

  15. What about the premise that when you do mass redirects, the new URLs are automatically dropped several places (if not pages) on a TEMPORARY basis while the changes are validated for longevity (i.e. Google waits to see if the redirects are going to stay in place before reinstating the pages in their original/improved positions in the SERPs)?

    Is there not a chance you acted too soon, and the positions would have reverted anyway?
