Best of 2013: No14 – How We Got Our Penalty Revoked Using the Disavow Tool #bestof2013stod
During the holiday weeks we will be showing you the 15 best-read posts of 2013. Except for Christmas and New Year's Day, each day you can read one of the best articles again, counting down from number 15 to number 1.
Now it's time for number 14, originally posted on February 7, 2013: a guest post by Sander Tamaëla, one of the two guest posts in the top 15!
This is a guest post by Sander Tamaëla, a freelance SEO consultant.
In October 2012 a new client approached me with a question: can you help me get a penalty revoked? As it happened, Google had just released its link disavow tool, and this penalty was link-based. In this post I want to share my three months of experience using the disavow tool to get a penalty removed.
It all started in early 2012, when my client signed an SEO deal with a no-cure-no-pay SEO agency. The problem with no cure no pay is how these companies tend to work:
• Try to gain rankings (yes, rankings are their metric of choice)
• Fail to meet goals, but still want to get paid
• Bring out the less legitimate tactics without the client knowing the risks
• Get rankings
• Get paid
• Lose rankings
• Lose client
In April the spamming started to gain traction and traffic started to grow. It peaked in the second week of May, after which it steadied. In August it somehow peaked again, then dropped and gradually glided back to the pre-spam traffic level. It continued to drop a bit more, to an all-time low of around 75 non-branded organic visitors. Keep in mind that this is a small niche with a very high value per visitor.
I knew we could try the disavow tool, but I wanted to take it slow on which links, and how many, we would disavow. Beyond that, I wanted to know whether the tool is of any use on its own, without doing what Google preaches: first contacting all webmasters to get links removed, then using the disavow tool on the remainder.
First reconsideration request
We started off with the data from Google Webmaster Tools. We checked all ~6,000 links against the following points:
- Are they editorial?
- Are they in the comments?
- Do they look paid?
- If they don't, is there any sign pointing to a way to buy a link?
- Do they look inserted after the first publish date of the post?
After double-checking we uploaded the first file to the disavow tool. Then we waited for about two weeks. (We did a reconsideration request after every disavow.)
The first rejection came on the 26th of November: the default reply stating that the site still violates the Google guidelines (in Dutch):
Second reconsideration request
Hmmm, we must have missed something, right? We downloaded the links from GWT again and checked (with VLOOKUP in Excel) whether there were any new links. There were: only five, but there were. We marked them as spam, combined them with the old list, and uploaded a new file to the disavow tool. We immediately filed a second reconsideration request, and the wait for a new reply started.
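The VLOOKUP step boils down to a set difference. A minimal sketch in Python (the URLs below are placeholders; in practice you would load the two GWT exports from file):

```python
def newly_found_links(old_export, new_export):
    """Return links that appear in the new export but not in the old one:
    the candidates to review and add to the next disavow file."""
    return sorted(set(new_export) - set(old_export))

# Hypothetical example data standing in for two GWT link exports.
old = ["http://spam-a.example/page", "http://spam-b.example/post"]
new = ["http://spam-a.example/page", "http://spam-b.example/post",
       "http://spam-c.example/comment-42"]

print(newly_found_links(old, new))  # only the freshly discovered link
```

The same diff works on domains instead of full URLs if you strip each link down to its host first.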
Around two weeks later we received the same reply: your site is violating Google’s guidelines. Please check the links.
With only default replies coming back from Google, we weren't getting any further. I wondered what could be causing the problem and came up with three possible explanations.
Time between uploading the links and doing a reconsideration request
Maybe there should be more time between uploading the links to the disavow tool and doing a reconsideration request. The systems could be out of sync somehow or maybe Google just wants you to take your time.
Really? Google developing a system that would take them extra hours to check or sync without automating it? No way; this needed to work out of the box. Giving users a way to generate a flood of new data without automation wouldn't make sense.
The disavow tool not working?
My client was convinced the disavow tool simply wasn't working. Besides that, other SEOs were claiming the tool was just a way for Google to gain more knowledge about spammy links and use it to build a better fingerprint for identifying new spam.
I was (and still am) convinced this is not the case. User-generated fingerprints could not be that useful: first because of the noise, and second because it would give users a new way to influence the algorithm in the wrong direction.
But what if Google keeps seeing new links appear now and then, even though you've stopped your spammy link scheme? New URLs on the same websites containing the same (duplicate) content, or comments with your spam links in them. Sounds plausible to me.
The problem: data
Data. This must be the problem. So we downloaded all the backlink data we could: Ahrefs and Majestic SEO combined with GWT gave us the largest dataset we could get our hands on (Opensiteexplorer.org was of no additional value in this case).
Again Excel comes in handy: using VLOOKUP to identify unique links and domains across the different datasets. The problem here is the amount of extra data GWT gives you: none. I wanted to know the anchor text of the links that appeared only in the GWT dataset. To get my hands on this data I developed a small tool that crawls the unique GWT links and extracts the anchor text of the links pointing to my client's site. We were then able to group links by anchor text, and saw one keyword used in over 40% of them. This was the keyword hit hardest, although the drop from first to second page wasn't that shocking. The keyword was stuck at the same position: on-page changes or new links wouldn't make it move up, or at least not for longer than a day. It kept dropping back to the same position.
Editorial note: the tools Sander discusses here are in pre-beta, but he was willing to share them:
domains-to-ip/ Turns domains into IP addresses and Class C subnets. With this tool you can convert all your domains from GWT to IP addresses and Class C subnets and filter them in Excel. It automatically filters duplicate entries from the input domains.
to-anchor-text/ You input a list of URLs on which you have backlinks, plus the target URL. It returns the anchor text; very handy to add to GWT data (especially because it is different data than, for example, Majestic/Ahrefs).
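The core of such an anchor-text tool can be sketched with Python's standard-library HTML parser (a minimal sketch, not Sander's actual tool; the fetching of each page is left out and the example HTML and URLs are placeholders):

```python
from html.parser import HTMLParser

class AnchorTextParser(HTMLParser):
    """Collect the anchor text of links whose href points at a target URL."""
    def __init__(self, target):
        super().__init__()
        self.target = target
        self.in_link = False   # True while inside a matching <a> element
        self.current = []      # text fragments of the current matching link
        self.texts = []        # collected anchor texts

    def handle_starttag(self, tag, attrs):
        href = dict(attrs).get("href") or ""
        if tag == "a" and href.startswith(self.target):
            self.in_link = True
            self.current = []

    def handle_data(self, data):
        if self.in_link:
            self.current.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.in_link = False
            self.texts.append("".join(self.current).strip())

def anchor_texts(html, target):
    """Return all anchor texts in `html` for links pointing at `target`."""
    parser = AnchorTextParser(target)
    parser.feed(html)
    return parser.texts

page = '<p>Buy <a href="http://client.example/page">cheap widgets</a> now</p>'
print(anchor_texts(page, "http://client.example"))  # ['cheap widgets']
```

Run over every unique GWT link, this yields the anchor-text column that GWT itself never provides, ready to group and count in Excel.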
Next I wanted to know whether we could identify more domains to disavow. We gathered the IP addresses of all domains and grouped them by IP and Class C subnet. The results weren't that shocking: the IPs seem quite random, and the same goes for the Class C subnets. Despite this, I wanted to know whether there were more spam domains on those IP addresses. Bing has an ip: operator that returns the sites running on a given IP. Although this was a hell of a job, I wanted to be sure there weren't other tactics in place that would ruin all our efforts. The good: I couldn't find any other sites I would classify as spam domains. The bad: it was a lot of extra work.
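The Class C grouping is a small transformation once the domains are resolved. A sketch under the assumption that resolution (e.g. via `socket.gethostbyname`) has already produced a domain-to-IP mapping; the domains and IPs below are placeholders:

```python
from collections import defaultdict

def class_c(ip):
    """Return the Class C subnet (first three octets) of an IPv4 address."""
    return ".".join(ip.split(".")[:3]) + ".0/24"

def group_by_class_c(domain_ips):
    """Group domains by the Class C subnet of their resolved IP.

    domain_ips: dict mapping domain -> IPv4 address string.
    """
    groups = defaultdict(list)
    for domain, ip in domain_ips.items():
        groups[class_c(ip)].append(domain)
    return dict(groups)

# Hypothetical resolved domains; two share a Class C subnet.
ips = {"spam1.example": "203.0.113.10",
       "spam2.example": "203.0.113.77",
       "other.example": "198.51.100.5"}
print(group_by_class_c(ips))
```

Subnets holding several of your spammy domains are the first candidates to inspect for shared ownership or further spam.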
So now we had a complete list of around 7500 backlinks and more than 600 IP addresses.
Using the domain: operator
The next step is critical, and a lesser-known option of the disavow tool: the domain: operator.
This operator gives you the option to disavow not just a single link but a complete domain. Why is this useful? Because Google keeps crawling the domains with unnatural links and keeps discovering the same link on different (duplicate) pages. Without the option to disavow a whole domain, things would have taken much longer than the three months we spent recovering from the penalty.
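For reference, a disavow file is a plain text file with one entry per line: lines starting with # are comments, domain: lines drop a whole domain, and bare URLs disavow individual links (the domains below are placeholders):

```text
# Contacted the webmaster of spamdomain1.example on 2013-01-05, no reply
domain:spamdomain1.example

# Single spammy comment link; the rest of the site looks legitimate
http://forum.example/thread-123#comment-456
```

Mixing both entry types in one file, as above, lets you nuke the hopeless domains while keeping the sites you might still want a link from.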
Between the first and second reconsideration requests, Google found a bunch of "new" links, even though the spamming had stopped months ago.
A small word of warning: you should never blindly disavow all domains! This could be a huge mistake if you ever want a legitimate editorial link from one of those domains: it would be worthless. So think before you act, even if your client is pushing your buttons and trying to force your hand. Remind your client that if they want to grow their conversions and traffic from Google's organic listings, this step should be double-checked at the very least.
We were in the lucky position that the spammed domains were off-topic and mostly in a language other than the native language of my client's site, so picking the domains to disavow completely was pretty easy. If you're not in that position, always ask yourself whether a link from that particular website could be of any value in the future.
After all of the above we ended up with a list of ~500 domains and a couple of separate links to disavow.
The final reconsideration request
Despite the common belief that you can only do so many reconsideration requests in a certain amount of time, we did three in a short period. Why do I think this isn't a problem? Because with each submitted reconsideration request we included a detailed explanation of the steps we took. We were honest and showed that we had really made an effort to fix the problem; we even named the two parties responsible for the spam.
Beyond that, you should convince Google why they should trust you. What did you do to prevent similar actions in the future? Tell the whole story, but don't make things up.
Finally, don't let the SEO/marketing guy file the reconsideration request; ask the most senior responsible person to sign the form you're sending (in this case the client's director did).
We submitted the final reconsideration request at the beginning of January and got the final word 3.5 weeks later (in Dutch):
The message above tells us:
- The penalty has been removed
- It wasn't an algorithmic penalty but a manual action against the site
- It could take some time to see the changes in the search results
- There might be other factors that negatively influence the rankings
- Google thanks us for the effort we put into ensuring the quality of the search results
What did the spammed keyword do in the rankings? At first it was dramatic: second to fourth page over the full three-month period. But within a couple of days we saw some recovery; it now ranks around position 15. I've advised my client to keep a low profile for the next month, but I think the road is paved to build more traffic in the future.
TL;DR, here are the learnings:
- The disavow tool works if you use it in the right way
- Blindly uploading all data from GWT into the disavow tool might not help you at all because of missing or new data
- Using the domain: operator is very useful, but be careful not to overuse it
- The domain: operator is not always an option if you spammed sites from which you do want a link in the future.
- Be patient; take your time to analyze. Don't rush in; it might make your life harder if you want to grow traffic in the future.
- Keep in mind that recovery is possible.
- It's all about data. The larger the dataset, the bigger the chance of recovery.
- I don’t think you need to take manual actions toward removing the bad links.
- In this case the penalty wasn't that severe, but the keyword was stuck at its position. Don't think you can link-build your way out of this.
- Google is smart. If you consider the Panda/Penguin release cycle and the effort and time it takes to recover, it scares away the wannabe grey/black hatter. I'm still wondering how many times you can screw up and recover, although I'm not testing this 😉
- Doing link spam to take larger chunks of traffic for a longer period of time might not be that feasible.
- Try to learn from each step you take. Other clients might profit from your well documented experience.
What grey/black SEO in particular might learn from this
If you're into grey/black hat and don't mind burning a domain, you could still make large amounts of cash by focusing on seasonality. You can still rank long enough to rank high during the holiday season. Just calculate the risk and don't put all your eggs in one basket.
Where Google might take the disavow tool
If I were Google, I would take the disavow tool to the next level.
Sandboxing after successful recovery
What if you received a message that the penalty has been lifted, but you were sandboxed for a month (or two) before any new efforts would gain traction? You would be happy that your penalty had been lifted. But Google has the advantage: they can update their algorithm and release a new Panda/Penguin and penalize your site again while it's in the sandbox. You wouldn't benefit from all the effort you put into your recovery.
Limit the number of domains to disavow
Although using the domain: operator can be a risk in itself, it is a powerful option. If you don't care about ever getting a link that counts from a domain in the future, you can use the disavow tool with the domain operator on your full link profile and then continue building links from other sources of equally low quality. You can burn a domain at least a second time.
But what if you were limited to, let's say, ~200 domains per disavow per month? It would take you more time and effort to recover and, most importantly, it would be harder to recover if you screwed up big time. There's also scalability: scaling up the number of people helping you recover is of no use, because you still have to wait.
Devalue domains that are disavowed a lot
Google says they don't do this yet, but I think they should. They already devalue a lot more links than they used to; why not devalue domains on a larger scale? Of course they should combine other, non-user-generated signals with the disavow lists, but the idea is pretty clear if you ask me.
What’s your experience with using the disavow tool?
Although I've seen all of the above work out just fine, I'm aware there may be other penalties that can't be lifted the way I just described. What did you try, and did it work? If it didn't, do you know why? I'd love to hear more examples of and experiences with the disavow tool: drop them in the comments! I'm willing to learn more, but also to help you if I can.
I’ll try to keep you posted on this particular case.