Away with the copycats: Google anti-spam algorithm change launched

31st January 2011

One of the most active departments within Google is the anti-spam team, run by Matt Cutts. Even though it seems very difficult to ban all spam from the search results, they are trying hard.

Just before the weekend Matt Cutts announced that changes to the algorithm which should “help drive spam levels even lower” are now live. The changes were announced on the Google blog ten days ago.

In the original post Google acknowledged that, judging by all the stories out there, it seemed as if Google’s results had become worse. In fact, Google stated that “according to the evaluation metrics that we’ve refined over more than a decade, Google’s search quality is better than it has ever been in terms of relevance, freshness and comprehensiveness”. English-language spam in Google’s results is supposedly less than half what it was five years ago. In other words: it has gotten better.

Now some might dispute that statement, but Google didn’t stop there. In that post (a bit tucked away), which was mostly about what a good job Google is doing at webspam control, they also announced that they were “evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content”.

That change has now been approved and launched. That means that for some queries you might see different results than before. The changes will go unnoticed by many, however: only in about 2% of queries, and in less than half a percent of the search results, will you be able to see changes.

The changes are mainly aimed at content farms. These are low-quality sites which copy content from other sites to draw traffic and get clicks on their ads. It’s another move by Google to get a grip on these sites. The problem, however, is that these sites usually aren’t the only ones affected; other sites can feel the harm too. In a Webmasterworld forum thread, several site owners already noted they saw drops in traffic on their sites.

Here’s the issue with this: many sites have partly original content and partly content taken from other sites, such as an introductory text. If these sites get “hit” by Google, that might not always be right. Without a doubt we’ll be seeing more complaints about this popping up in the near future.

Written By
Bas van den Beld is an award-winning Digital Marketing consultant, trainer and speaker. He is the founder of State of Digital and helps companies develop solid marketing strategies.