
Best of 2013: No14 – How We Got Our Penalty Revoked Using the Disavow Tool #bestof2013stod

23 December 2013


During the holiday weeks we will be showing you the 15 best-read posts of 2013. Except for Christmas Day and New Year's Day, each day you can read one of the best articles again, counting down from number 15 to number 1.

Now it’s time for number 14, originally posted on 7 February 2013: a guest post by Sander Tamaëla, one of only two guest posts in the top 15!

This is a guest post by Sander Tamaëla, a freelance SEO consultant.

In October 2012 a new client approached me with a question: can you help me get a penalty revoked? As it happened, Google had just released its link disavow tool, and this penalty was link-based. In this post I want to share my three months of experience using the disavow tool to get a penalty removed.

It all started in early 2012, when my client signed an SEO deal with a no-cure-no-pay SEO agency. The problem with no cure no pay is how these companies seem to work:

• Try to gain rankings (yes, rankings are their metric of choice)
• Fail to meet goals, but still want to get paid
• Bring out the less legitimate tactics without the client knowing the risks
• Get rankings
• Get paid
• Lose rankings
• Lose client

 

 

Google Analytics screenshot: non-branded organic traffic over time

In April the spamming started to gain traction and traffic started to grow. It peaked in the second week of May, then steadied. In August it somehow peaked again, then dropped, gradually gliding back to the traffic levels from before the spamming. It continued to drop a bit further, to an all-time low of around 75 non-branded organic visitors. Keep in mind that this is a small niche with a very high value per visitor.

I knew we could try the disavow tool, but I wanted to take it slow on which links, and how many, we would disavow. I also wanted to know whether the tool is of any use by itself, without doing what Google preaches: contacting all webmasters first and only then using the disavow tool on the remaining links.

Unnatural link warning (in Dutch)

First reconsideration request

We started off with the data from Google Webmaster Tools. We checked all ~6,000 links on the following points:

  • Are they editorial?
  • Are they in the comments?
  • Do they look paid?
  • If not, is there any sign pointing to a way to buy a link?
  • Do they look inserted after the post’s first publish date?

After double-checking we uploaded the first file to the disavow tool. Then we waited for about two weeks. (We did a reconsideration request after every disavow.)

The first rejection came on 26 November, with the default reply stating that the site was still violating Google’s guidelines (in Dutch):

Second reconsideration request

Hmmm, we must have missed something, right? We downloaded the links from GWT again and checked (with VLOOKUP in Excel) whether there were any new links. There were, though only five. We marked them as spam, combined them with the old list and uploaded a new file to the disavow tool. We immediately filed a second reconsideration request, and the waiting for a new reply began.
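For those who prefer scripting over VLOOKUP: the same new-link check is just a set difference. A minimal Python sketch (the URLs below are invented placeholders, not the client’s actual backlinks):

```python
def find_new_links(old_export, new_export):
    """Return links present in the new GWT export but missing from the old one."""
    return sorted(set(new_export) - set(old_export))

old = ["http://spam-a.example/page1", "http://spam-b.example/post"]
new = ["http://spam-a.example/page1", "http://spam-b.example/post",
       "http://spam-c.example/comment-7"]

print(find_new_links(old, new))  # ['http://spam-c.example/comment-7']
```

Re-running this against each fresh GWT export tells you immediately whether Google has discovered links you haven’t disavowed yet.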

Around two weeks later we received the same reply: your site is violating Google’s guidelines. Please check the links.

Getting only default replies from Google, we weren’t getting any further. I wondered what could be causing the problem and came up with three possible explanations.

Time between uploading the links and doing a reconsideration request

Maybe there should be more time between uploading the links to the disavow tool and doing a reconsideration request. The systems could be out of sync somehow or maybe Google just wants you to take your time.

Really? Google developing a system that would take them extra hours to check or sync without automating it? No way; this needed to work out of the box. Giving users a way to generate a flood of new data without automation wouldn’t make sense.

The disavow tool not working?

My client was convinced the disavow tool simply wasn’t working. On top of that, other SEOs were claiming the tool was just a way for Google to gain more knowledge about spammy links and use it to build a better fingerprint for identifying new spam.

I was (and still am) convinced this is not the case. User-generated fingerprints could not be that useful: first because of the noise, and second because they would give users a new way to influence the algo in the wrong direction.

Data

But what if Google sees new links appearing now and then, even though you’ve stopped your spammy link scheme? New URLs on the same websites containing the same (duplicate) content, or comments with your spam links in them. Sounds plausible to me.

The problem: data

Data. This had to be the problem. So we downloaded all the backlink data we could: Ahrefs and Majestic SEO, combined with GWT, gave us the largest dataset we could get our hands on (Opensiteexplorer.org was of no additional value in this case).

Again Excel came in handy: we used VLOOKUP to identify unique links and domains across the different datasets. The problem is the limited extra data that GWT gives you: none. I wanted to know the anchor text of the links that appeared only in the GWT dataset. To get that data I developed a small tool that crawls the unique GWT links and extracts the anchor text of the links pointing to my client’s site. We could then group links by anchor text, and saw one keyword that was used for over 40% of them. This was the keyword that was hit the hardest, although dropping from the first to the second page wasn’t that shocking. The keyword was stuck at the same position: on-page changes or new links wouldn’t make it move up, or at least not for longer than a day. It kept dropping back to the same position.

Editorial Note: the tools Sander discusses here are in pre beta, but he was willing to share them:

  • http://metsander.nl/tools/domains-to-ip/ Turns domains into IP addresses and Class C subnets. With this tool you can convert all your domains from GWT to IP addresses and Class C subnets and then filter them in Excel. It automatically filters duplicate entries from the input domains.
  • http://metsander.nl/tools/url-to-anchor-text/ You input a list of URLs on which you have backlinks, then you input the target URL. It returns the anchor text; very handy to add to GWT data (especially because it is different data than, for example, Majestic/Ahrefs).
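The Excel side of that workflow, merging link exports and computing the share of each anchor text, can be sketched in a few lines of Python. The anchor texts below are invented sample data, not the client’s actual keywords:

```python
from collections import Counter

def merge_links(*exports):
    """De-duplicate backlinks across exports (e.g. GWT, Ahrefs, Majestic SEO)."""
    merged = set()
    for links in exports:
        merged.update(links)
    return sorted(merged)

def anchor_shares(anchors):
    """Fraction of backlinks carried by each distinct anchor text."""
    counts = Counter(anchors)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.items()}

# Invented profile: one anchor dominates, as in the case described above.
anchors = ["cheap widgets"] * 4 + ["brand name"] * 3 + ["click here"] * 3
shares = anchor_shares(anchors)
print(shares["cheap widgets"])  # 0.4
```

A share like 0.4 for a single commercial anchor is exactly the kind of over-optimized pattern described above.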

Next I wanted to know whether we could identify more domains to disavow. We gathered the IP addresses of all domains and grouped them by IP and Class C subnet. The results weren’t that shocking: the IPs seemed quite random, and the same goes for the Class C subnets. Despite this, I wanted to know whether there were more spam domains on those IP addresses. Bing has an ip: operator that returns the sites running on a given IP. Although it was a hell of a job, I wanted to be sure there were no other tactics in place that would ruin all our efforts. The good: I couldn’t find any other sites I would classify as spam domains. The bad: it was a lot of extra work.
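The Class C grouping itself is simple string work once you have the IPs (resolving domains to IPs is what the tool above does). A minimal sketch with invented domains and documentation-range IPs:

```python
from collections import defaultdict

def group_by_class_c(domain_ips):
    """Group domains by the Class C (/24) subnet of their IP address."""
    groups = defaultdict(list)
    for domain, ip in sorted(domain_ips.items()):
        # Keep the first three octets; everything in x.y.z.* shares a Class C.
        subnet = ".".join(ip.split(".")[:3]) + ".0/24"
        groups[subnet].append(domain)
    return dict(groups)

ips = {
    "spam-a.example": "203.0.113.7",
    "spam-b.example": "203.0.113.88",   # same /24 as spam-a
    "other.example":  "198.51.100.14",
}
print(group_by_class_c(ips))
```

Subnets with several spam domains clustered on them are good candidates for a closer look with Bing’s ip: operator.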

So now we had a complete list of around 7500 backlinks and more than 600 IP addresses.

Example of Bing’s ip: operator

Using the domain: operator

Example of the domain: operator

The next step is critical, and a lesser-known option of the disavow tool: the domain: operator.

This operator gives you the option to disavow not just a single link but a complete domain. Why is this useful? Because Google keeps crawling the domains carrying unnatural links and keeps discovering the same link on different (duplicate) pages. Without the option to disavow a whole domain, things would have taken a lot longer than the three months we spent recovering from the penalty.
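For reference, a disavow file can mix both styles. Lines starting with # are comments, domain: lines disavow every link from a domain, and bare URLs disavow single links (all domains and URLs below are invented examples):

```text
# Contacted the owner of spamdomain1.example on 2012-12-01, no reply
domain:spamdomain1.example
domain:spamdomain2.example

# Single links we could not get removed
http://spamdomain3.example/forum/thread-123#comment-9
```

The file is a plain .txt upload, one entry per line.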

Between the first and second reconsideration requests, Google found a bunch of “new” links, even though the spamming had stopped months ago.

A small word of warning: you should never blindly disavow all domains! This could be a huge mistake if you ever want a legitimate editorial link from one of those domains; it would then be worthless. So think before you act, even if your client is pushing your buttons and trying to force your hand. Remind your client that if they want to grow their conversions and traffic from Google’s organic listings, this step should be double-checked, at the very least.

We were in the lucky position that the spammed domains were off-topic and mostly in a language other than the native language of my client’s site, so picking the domains to disavow completely was pretty easy. If you’re not in that position, always ask yourself whether a link from that particular website could be of any value in the future.

After all of the above we ended up with a list of ~500 domains and a couple of separate links to disavow.

The final reconsideration request

Despite the common belief that you can only do so many reconsideration requests in a certain amount of time, we did three in a short period. Why do I think this isn’t a problem? Because with each submitted reconsideration request we included a detailed explanation of the steps we took. We were honest and showed that we really made an effort to fix the problem; we even named the two parties responsible for the spam.

Beyond that, you should convince Google why they should trust you. What did you do to prevent similar actions in the future? Tell your whole story, but don’t make things up.

Finally, don’t let the SEO/marketing guy file the reconsideration request; ask the highest responsible person to sign the form you’re sending (in this case, the client’s director did).

We submitted the final reconsideration request at the beginning of January and got the final word 3.5 weeks later (in Dutch):

final message

 

The message above tells us:

  • The penalty has been removed
  • It wasn’t an algo penalty but a manual action against the site
  • It could take some time to see the changes in the search results
  • There might be other factors that negatively influence the rankings
  • Google thanks us for the effort we made to ensure the quality of the search results

What did the spammed keyword do in the rankings? At first it was dramatic: stuck on the second to fourth page over the whole three-month period. But within a couple of days of the penalty being lifted we saw some recovery; it now ranks around position 15. I’ve advised my client to keep a low profile for the next month, but I think the road is paved to build more traffic in the future.

Lessons learned

TL;DR, here are the learnings:

  • The disavow tool works if you use it in the right way.
  • Blindly uploading all data from GWT into the disavow tool might not help you at all, because of missing or new data.
  • The domain: operator is very useful, but be careful not to overuse it.
  • The domain: operator is not always an option if you spammed sites from which you do want a link in the future.
  • Be patient; take your time to analyze. Don’t rush in; it might make your life harder if you want to grow traffic in the future.
  • Keep in mind that recovery is possible.
  • It’s all about data. The larger the dataset, the bigger the chance of recovery.
  • I don’t think you need to take manual action toward removing the bad links.
  • In this case the penalty wasn’t that major, but the keyword was stuck at its position. Don’t think you can link-build your way out of this.
  • Google is smart. If you consider the Panda/Penguin release cycle and the effort and time it takes to recover, it scares away the wannabe grey/blackhatter. I’m still wondering how many times you can screw up and recover, although I’m not testing this ;)
  • Doing link spam to take larger chunks of traffic for a longer period of time might not be that feasible.
  • Try to learn from each step you take. Other clients might profit from your well-documented experience.

What grey/black-hat SEOs in particular might learn from this

If you’re into grey/black hat and don’t mind burning a domain, you could still make large amounts of cash by focusing on seasonality. You can still rank high for long enough during the holiday season. Just calculate the risk and don’t put all your eggs in one basket.

Where Google might take the disavow tool

If I were Google, I would take the disavow tool to the next level.

Sandboxing after successful recovery

What if you received a message that the penalty has been lifted, but you were sandboxed for a month (or two) before any new efforts would gain traction? You would be happy that your penalty had been lifted. But Google has the advantage: they can update their algo and release a new Panda/Penguin and penalize your site again while it’s in the sandbox. You wouldn’t benefit from all the effort you put into your recovery.

Limit the number of domains to disavow

Although using the domain: operator might be a risk in itself, it is a powerful option. If you don’t care about the ability to get a link that counts from a domain in the future, you can use the disavow tool with the domain: operator on your full link profile and then continue building links from other sources of equally low quality. You can burn a domain at least a second time.

But what if you were limited to, let’s say, ~200 domains per disavow per month? It would take you more time and effort to recover and, most importantly, it would be harder to recover if you screwed up big time. There’s also scalability: scaling up the number of people helping you recover is of no use, because you still have to wait.

Devalue domains that are disavowed a lot

Google says it doesn’t do this yet, but I think it should. Google already devalues a lot more links than it used to; why not devalue domains on a larger scale? Of course it should combine other, non-user-generated signals with the disavow lists, but the idea is pretty clear if you ask me.

What’s your experience with using the disavow tool?

Although I’ve seen all of the above work out just fine, I am aware that there might be other penalties that can’t be lifted the way I just described. What did you try, and did it work? If it didn’t, do you know why? I’d love to hear more examples of and experiences with the disavow tool: drop them in the comments! I’m willing to learn more, but also to help you if I can.

I’ll try to keep you posted on this particular case.

AUTHORED BY: Sander Tamaëla

This post was written by an author who is not a regular contributor to State of Digital. See all the other regular State of Digital authors here. Opinions expressed in the article are those of the contributor and not necessarily those of State of Digital.
  • Bas

    Hi Sander,

    Interesting case and thanks for sharing! Did this customer also receive a Penguin warning in WMT last July?

    • http://twitter.com/tamaela Sander Tamaëla

      No, the message in September was the only message they received.

  • Fili Wiese

    Someone on G+ asked me (as a response to sharing this great article): Did you return to previous level of traffic after using the Disavow tool? Previous being the traffic you saw before you saw the need to use this tool.?

    • http://twitter.com/tamaela Sander Tamaëla

      It over-performed. We regained traffic faster than I expected. Although the week still has a few days left, we’re already at an all-time high of around 600 non-branded visits (see the GA screenshot above for comparison). All of the important keywords are still climbing, first without on-site optimization, the last step with it.

      • Peter

        Great to hear this. Do you have to submit a reconsideration request or can this just happen naturally?

        The reason for this is that I submitted a disavow request today because of spam links, but I wasn’t actually contacted by Google in the first place.


  • Jerry

    I’m trying to recover a site at the minute that has been destroyed with thousands of awful comment spam links with exact match anchors. I’ve done 3 reconsideration requests highlighting the 3000 links we’ve been able to remove, and the ones we’ve had no response from. I’ve tried disavowing these and it has made no difference.

    I’m at a loss as to what to do next – we can’t find contact info for half of these Taiwanese websites that have comments on, and chances are they’re never going to remove 500+ comments from posts from 2/3 years ago

    • http://twitter.com/tamaela Sander Tamaëla

      Have you tried combining data from different sources and using the domain: operator? Beyond that: be honest in your reconsideration request. Even if you didn’t comment-spam yourself, you might have benefited from the extra traffic Google gave you for a period because of the spam. Be thankful (and tell Google), but realize you might not get back to that level after your penalty has been revoked. Just tell how you experienced every part: from a month before the penalty, to the penalty itself, to what you have done to solve the problem (hours and details). And realize that it’s Google’s index, and a Google employee will read your request.

      • Jerry

        Thanks for the reply. It’s for one of my clients – a previous company had purchased lots of comment links (10,000+). We’ve tried combining links from Webmaster Tools/Open Site Explorer/Majestic and Ahrefs, we’ve documented when we contacted each site we could find contact info for, a second contact date, and the response. We had separate Google Docs for the sites we removed successfully, but it still ruins the link profile totally, with thousands of unwanted links.

  • GC

    I would disagree with not using the domain: operator. The sites you are disavowing are most likely sites you would never want to get a link from anyway – otherwise I’d try a different tack with changing the link – like contacting the site owner. I wouldn’t shy away from using the domain: operator at all. My (successful) disavow request contained an entire list of domain level disavows. (obviously, analysed for disavowal for reasons not dissimilar to those described above) Also it only included links data from GWT though I’m sure including a wider variety of sources won’t hurt either

    • http://twitter.com/tamaela Sander Tamaëla

      I think you’re exactly saying what I tried to say in the post above: if it’s easy to pinpoint the wrong links and you don’t need a link in the future, remove or disavow the link.


  • http://www.facebook.com/people/Chet-Jariwala/100000515707100 Chet Jariwala

    What if I didn’t receive an unnatural links message in GWT? However, I know my site has been hit by Penguin for sure (traffic dropped exactly after the first Penguin launch), so can I still use the disavow tool even if I got no manual penalty message?

    • http://twitter.com/tamaela Sander Tamaëla

      I would approach this slightly differently. First fix all other possible negative on-site factors, then do a reconsideration request. If the response is negative, read carefully what the message says; it might point to your problem. If it doesn’t, try removing links, and in the end disavow the rest. Repeat till successful (it might take a long time).

    • Rank Watch

      A traffic drop exactly after the first Penguin launch might just be a coincidence. If you firmly believe that your site has spammy or automated backlinks, is a content farm just for the sake of running ads, or produces low-quality content adding little or no value to users, then it might be hit by Panda or Penguin. In some cases the ranking drop might be due to competitors having a relevant landing page for the keyword, getting authoritative natural backlinks from relevant sites, and moving up in the rankings. So, as Sander mentioned, consider checking your site against the Webmaster Guidelines before proceeding to disavow links and file a reconsideration request.


  • http://www.hiswebmarketing.com/ Marie Haynes

    Thank you for posting this. Case studies on reconsideration are always valuable. But, I felt I needed to comment because something doesn’t make sense here. Google is pretty clear that it is not enough to just disavow links in order to get a manual penalty removed.

    Here is a quote from John Mueller: “We know that perhaps not every link can be cleaned up, but in order to deem a reconsideration request as successful, we need to see a substantial good-faith effort to remove the links,” (Source: http://productforums.google.com/forum/#!searchin/webmasters/authorname:johnmu$20$2B$20reconsideration/webmasters/kYeXW-c_8aI/vHApywen7MAJ)

    And, from the Google Webmaster Blog: “If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page.”

    If this is the case, then this would explain why your site failed reconsideration initially. But, why then did it pass eventually without you removing links manually? My theory is that many of the domains linking to you may have gone offline. When I do reconsideration requests and visit domains (to assess whether or not they are natural links and also to gather contact information) it is amazing how many of these domains give a 404 or “page not found” message.

    There is a Google webmaster video in which Matt Cutts explains one part of the reconsideration process in which they first take a look at a subset of your unnatural links and see how many of them you have gotten manually removed before deciding whether to reconsider your site. I don’t know if this subset is randomly obtained. If so, then this could be the reason for you finally passing – i.e. if they sampled a certain number of links and they were gone because the sites had gone offline.

    In any case I felt I needed to comment because people need to know that in most cases, a good number of links need to be manually removed before a site will pass reconsideration.

    • http://twitter.com/tamaela Sander Tamaëla

      I agree. Since this post I’ve been working on several other unnatural link penalties and most of them indeed need the extra work to get the link removed (or at least a list that’s shows you’ve tried). What it teaches me? Don’t go all in just on a single post and keep thinking.


  • Martijn

    Thanks for your article. My question is almost the same as Chet’s. The traffic on the website of one of my clients dropped ~45% when Penguin was released. Although they didn’t receive any message in GWT, we are sure it’s due to anchor-text over-optimisation and shady backlinks. In your reply to Chet’s question you said you could file a reconsideration request. We don’t want to wake up a sleeping giant: although it’s 99% certain they have bad backlinks, we don’t know for sure whether they were penalised. What’s your opinion on this one?

    • http://metsander.nl/ Sander Tamaëla

      Don’t use the disavow tool to start with. Try to get the bad links removed by contacting the webmasters. I’ve been working on another case where this helped a lot.

    • Sander Tamaëla

      If you’re sure it’s an algorithmic penalty you should solve all issues and wait for the next iteration of Penguin.

  • Alex Smith

    I am new to SEO and I am interested in your case, since we are trying to clean up our link profile. How exactly did you go about deciding which links were unique links and domains across the different datasets using VLOOKUP? I guess my question is basically: how did you decide which links to do more in-depth analysis on?

    • http://metsander.nl/ Sander Tamaëla

      Manually review each link and classify it as spam, paid, unnatural or natural.

      I know that unnatural depends on your own view, but just ask yourself: is it a high quality site and why is my link there?

      Hope this helps.

  • igl00

    good to hear it was possible to lift the penalty


  • Wimbo

    Hi all, we received a penalty for unnatural links to our site. After a few months of hard work and two requests for reconsideration, we received a message that the penalty has been lifted. Although the penalty is officially revoked, we still see the manual action in GWT. Does anyone have experience with this? How much time before it disappears? And does anyone have experience with how much time passed between the actual message that the penalty had been revoked and the rankings being (more or less) back at their old positions? Are we talking days, weeks, or possibly months?

  • Spook SEO

    After going through your detailed post on your experience with the disavow tool, I am now pretty sure the tool really works. You just need to observe all the backlinks closely and use the tool only after detecting the right problem.



  • studiumcirclus

    In the case of not getting all the link-data you need from GWMT exports, you can actually use Screaming Frog to fill in the blanks.

    Upload the list of link sources to Screaming Frog in list mode and filter by pages which contain a link to your client’s site, or reference their domain / URL.

    You can then export the links which Screaming Frog found and VLookup all the anchor text, link destinations etc.
