Editor's note: this is Bryant Dunivan's first official post as a State of Search blogger. He did a guest post before, but will be on as a regular from now on. Welcome Bryant!
With Google’s recent algorithm change, many webmasters will be wondering how it will affect them in both the short and long term. This update sees webpages being judged based on how easy it is for the searcher to find the content that is highlighted on the search engine results page (SERP).
What is most interesting about this update is that it is estimated to play a role in only 1% of searches. We do know two things about Google’s estimates though: 1) They are sometimes wrong, and 2) What starts small can turn into a big portion of your SEO strategy.
Short Term Effect
Basically, if you hide your content in a ton of ads, you are in danger of Google singling you out in this update. The most common type of site I think of is the annoying song lyric site. We have all seen them, we search trying to find what an often-misquoted lyric is, searching for say [Elton John Rocket man lyrics], and we land on a page like this:
Now, we searched for lyrics – Google knows what we wanted, and gives us this page as the #2 result. Paying specific attention to Google's statement, the goal is based on the complaint that, "Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don't have much content 'above-the-fold' can be affected by this change." So here we have it: the lyrics we were so desperately looking for are below the fold, and we have to scroll past a banner ad, a skyscraper ad, and a Flash ad to get to the 16-20 lines of song lyrics.
Now, will Google take this particular site out of the top 10 because of this algorithm update? No one knows for sure, but this kind of ad use is exactly what the update is targeting. If a user has to scroll past all of your ads to reach their answer, you are in danger of conflicting with this change.
How will Google implement this change?
This is the more difficult question to answer – how will the algorithm quantify an arguably subjective factor into a piece of a formula? I think in the short term, we will see this algorithm applied only to the most serious violations. As we have seen before, just because Google says a change is small does not mean it stays that way. Moving forward, I have two ideas for how this update could be implemented:
Option 1: Text Ad Code Detection
The first method of ad detection is simple in form, but requires a strict variable – the ads must be served using AdSense or a similar text-based ad source. AdSense, for those unfamiliar, is Google's ad-serving answer for publishers. If you have seen the box of text-based ads that look similar to Google search results, you know what I am talking about. You may also notice AdChoices ads rampant on many sites; AdChoices is a cross-platform advertising indicator that tells the user who served the ad and that it was targeted to their interests:
Now, Google preprograms this AdSense block for webmasters. The AdSense display will have nuances that are the same across the web. Google could use a ratio or some other signifier to determine how many AdSense boxes there are in relation to actual content, and from that determine what kind of user experience the page will create. The same holds true for ad providers that use AdChoices: the webmaster inserts these preprogrammed codes into a website template, and the code shows ads dynamic to each user based on their interests. AdChoices-provided ads might also be identified by their logo. It is a standardized logo with a multitude of advertisers taking part in it, so that in and of itself may become a detection signal for GoogleBot.
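Google has never published such a heuristic, so purely as an illustration of the ratio idea above, a crude check might compare recognizable ad markup against visible text. The ad-markup patterns and the ratio itself below are assumptions for the sketch, not known ranking signals:

```python
import re

# Hypothetical heuristic: estimate the ratio of ad markup to real content.
# These markers (adsbygoogle, googlesyndication.com) are common AdSense
# conventions, NOT a documented Google detection signal.
AD_PATTERNS = [
    r'class="adsbygoogle"',
    r'googlesyndication\.com',
]

def ad_to_content_ratio(html: str) -> float:
    """Count ad markers and compare them to the visible word count."""
    ad_hits = sum(len(re.findall(p, html)) for p in AD_PATTERNS)
    # Strip tags crudely to approximate the visible text.
    text = re.sub(r'<[^>]+>', ' ', html)
    words = len(text.split())
    if words == 0:
        return float('inf')  # all ads, no content at all
    return ad_hits / words

page = (
    '<div class="adsbygoogle"></div>'
    '<script src="https://pagead2.googlesyndication.com/ads.js"></script>'
    '<p>Rocket man burning out his fuse up here alone</p>'
)
print(round(ad_to_content_ratio(page), 3))  # → 0.222
```

A real crawler would of course weigh position (above vs. below the fold) rather than raw counts, which is exactly the extra crawl work discussed below.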
There are some issues with this kind of detection. It is only applicable to ad mediums that Google has looked at and decided to add to their discovery definition. It will not be helpful for sites that sell their own advertisements. Also, this may decrease the natural crawl rate of a given site, as now GoogleBot must look for certain content variables and return cues that it otherwise would not have to.
It would be helpful, though, for a site that returns as relevant in a given search but is peppered with text-based advertisements. Moving forward, Google can learn how to clean up the SERP, eliminating pages powered by text-based ads, and refine the detection algorithm as both ads and the web evolve. If development of this algorithm continues, it may eventually be able to eliminate these types of pages from the top 10 – in this instance, [how to fix a broken ipad screen].
Option 2: Bounce Related Detection
Now, the more likely use of Google's time will be simply to detect that a user clicks on a result from the SERP and then either immediately or quickly hits the back button and selects a new result from the list of 10 given. This could be similar to how Google used to (and will again) give users the ability to block a domain from appearing in their search results. In the initial announcement of the algorithm update, Google said they had received complaints about results whose content is hidden on the page itself, but they did not say how those complaints were received.
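Google does not publish how (or whether) it measures this "pogo-sticking" behavior, but the idea is simple enough to sketch: flag results where visitors come back to the SERP within a few seconds. The threshold and data shape here are invented for illustration:

```python
from datetime import datetime, timedelta

# Assumed threshold: a return to the SERP this fast suggests the page
# did not surface the content the user searched for.
QUICK_BOUNCE_SECONDS = 10

def quick_bounce_rate(visits):
    """visits: list of (click_time, return_time_or_None) tuples.

    Returns the fraction of visits where the user bounced back to
    the results page within QUICK_BOUNCE_SECONDS.
    """
    bounces = 0
    for clicked, returned in visits:
        if returned is not None:
            dwell = (returned - clicked).total_seconds()
            if dwell < QUICK_BOUNCE_SECONDS:
                bounces += 1
    return bounces / len(visits)

t = datetime(2012, 1, 25, 12, 0, 0)
visits = [
    (t, t + timedelta(seconds=3)),   # hit back almost immediately
    (t, t + timedelta(seconds=45)),  # stayed and read the page
    (t, None),                       # never returned to the SERP
]
print(round(quick_bounce_rate(visits), 3))  # 1 of 3 visits bounced quickly
```

A consistently high rate across many queries would be the kind of signal that could feed a quality demotion, much like the blocked-domain data discussed next.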
It seems reasonable to presume that, in some ways, the blocking of a domain across the web could come into play here. When Google initially announced the ability to block domains from your results, they did say that they would look at the data in order to streamline search results. If true, this would be yet another indicator of how some seemingly minor aspect of search could become a big factor in the future.
How bad can it get?
The biggest fear that should arise from this algorithm update is from those webmasters who are currently using jQuery or something similar to hide content for space reasons or utilizing a rotating header. It should be noted we are not talking about Flash here, because that poses its own problem in SEO.
Take for example this query: [what is ken anderson's tna status]. It returns a result from a specific wrestling news site, which shows in its description an exact response to this query.
When you go to the page, you are brought to a home page and notice a rotating banner, and will not find the snippet that matched your search so well anywhere on the page. In order to see the answer to your particular question, you have to wait almost a minute before the appropriate slide comes up on the page.
Now, I am not saying that this site is going to be decimated in a future iteration of the algorithm update, but this site and others like it may want to explore how usable that kind of navigation is for their users. If you currently use this type of navigation, you might want to ask yourself a few questions in reviewing it:
- Is the navigation intuitive?
- Would an average user know to click on a particular item to move on to the content they are looking for?
- Is there more than one way to get the answer on your page for a given search?
- Can YOU find the content on your site?
- Could your grandparents (or another non web-savvy substitute) find what they are looking for?
If you can answer yes to all of these, you should be all good. If not, you may require some streamlining in the future, if the algorithm progresses, in order to make it easy for users to locate the content on a given page.
Google tends to build on itself, and while this may be what the algorithm does in the future, it may do something entirely different. What we do know is that right now, Google is focusing on how users are able to get to the content they see in the SERP. They will look at how ads work with or against the page and draw a conclusion based on that. If you clutter your pages with ads and tend to place your content below the fold, it may be time to redo your page template and serve the content up front and above the fold.
Going forward, watch out – just as Caffeine gave way to Panda and Freshness, this particular algorithm update may turn into an animal all on its own.