#SEOscience Snake Oil vs Solid #CreativeSEO

28th January 2015

Bah Humbug to #SEOScience. It is no great mystery that these #CreativeSEO bets will pay off.

There is a habit pervading the SEO industry of testing all the anecdotal minutiae, rumors from Google, and random “insights” from discussions with Google spokespeople. Findings get reported under the auspices of #SEOScience, but they amount to little more than some impassioned SEO tail chasing. At which point all the mountebanks show up to compare their brands of “Snake Oil.”

Snake Oil
No one knows what goes into this or if it does anything useful.

The problem with trying to “prove” SEO axioms is that doing so generates spammy experiments that don’t really tell us anything new and tend to just confirm, unscientifically, what we already suspect. Moreover, it tells us nothing about what kinds of SEO work will stand the test of time. It is common sense that our tiny corner of the internet is not a good representation of what is actually going on online — you know, on the actual internet.

Nonetheless, that doesn’t stop us talking about what works, even if our findings are just anecdotes. And most SEO science experiments are closer to lay science — which, as Michael Martinez writes, also has its benefits.

My creative SEO thesis is that SEO science is completely missing the point by testing the wrong things and sharing results about its circular findings. I agree with Martinez that real SEO science would share data. If you’re not sharing data, it’s all just hearsay. But shared data is actually what’s missing from SEO science more than anything – and its continual absence leaves an entire industry open to the charge of “selling Snake Oil.” And that is plain unfair on all of us.

Even if, like me, you think that most SEO science is engaged in pointless posturing (because apparently you only need to tell a more sensational SEO story than the Google Guidelines to earn the title of SEO rockstar/data prophet/marketing ninja/social media guru), we can still generate value for our readers by laying down the mantle of science and reporting the anecdotes of our experiments more honestly: as anecdotes.

I’ve crafted this post as proof. This is what my own data says. This is some of what I know about SEO right now. And it’s all just anecdotes.

Below is a series of obvious creative SEO bets, all aligned with what’s written in Google’s Guidelines. Each is accompanied by an example of the kind of measurable results you could generate. I’ve written a companion post to this one, and I’m deliberately using hashtags in this post to mark both sides of the debate; I hope you will join in this important industry discussion on Twitter. If we widen our debate to include hashtags, we can empower all voices in the industry and defuse obstructive partisanship.

My hope is that the highlights below will embolden you to bet on the “unproven” and anecdotal effectiveness of #creativeSEO: a nuanced way of thinking that follows the spirit of the Google Guidelines.

Google says, “Make sure that your title elements and ALT attributes are descriptive and accurate.”

Changing the Title tag can yield a ranking benefit in just one month.

The only mystery here is guessing what keywords an information seeker might use.
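To make this bet concrete, below is a minimal audit sketch of my own (not tooling from this post) that flags thin title elements and images missing ALT attributes. The URLs and the length threshold are hypothetical placeholders.

```typescript
// A rough audit sketch, not a definitive implementation. It fetches a couple of
// pages (the URLs are hypothetical placeholders) and flags titles that look thin,
// plus images that are missing an alt attribute entirely.
const pages = [
  "https://www.example.com/",
  "https://www.example.com/pricing",
];

async function auditPage(url: string): Promise<void> {
  const html = await (await fetch(url)).text();

  // Crude extraction with regexes; a real audit would use a proper HTML parser.
  const title = (html.match(/<title[^>]*>([^<]*)<\/title>/i)?.[1] ?? "").trim();
  const imagesMissingAlt = (html.match(/<img(?![^>]*\balt=)[^>]*>/gi) ?? []).length;

  if (title.length < 15) {
    console.log(`${url}: title looks thin or generic: "${title}"`);
  }
  if (imagesMissingAlt > 0) {
    console.log(`${url}: ${imagesMissingAlt} image(s) with no alt attribute`);
  }
}

Promise.all(pages.map(auditPage)).catch(console.error);
```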

Targeting Search Traffic with SEO
This graph shows the difference in estimated search traffic (according to Google AdWords) gained in one month from an improvement in rankings, just by changing the website title tags.

It will also cause a rank shift further back in the index.

It works because you just did what the Google Guidelines told you to do.

Targeting Search Traffic with SEO
This graph shows the difference in estimated search traffic (according to Google AdWords) gained in two months, just by changing the website title tags. But the important thing to note is that the gain in top 30 and top 100 positions indicates significant potential for even more improvement.

Which also delivers an indexing benefit.

Are you getting the idea now?

Webmaster Tools Data
This graph shows the actual gain in exposure (according to Webmaster Tools) in two months just by changing the website title tags.

And a search traffic benefit.

It really does. Cool, huh?

Organic Search Traffic Over Time
This graph plots the effect of a content strategy on organic search traffic over time.

But it still might take a seasonal dip.

You can’t win ’em all, buddy!

Seasonality in Webmaster Tools Data
This graph shows a drop in search visibility in general (according to Webmaster Tools), probably due to seasonality, though it may equally be an indicator of something as yet unknown.

Google says, “Create a useful, information-rich site, and write pages that clearly and accurately describe your content.”

Fresh content can deliver huge instant wins.

One single initiative generated as much traffic and as many links to the site as the entire site had garnered in the whole previous year. Yet that it happened at all was a mixture of luck and experience. It was simply a great gamble.

Performance of a Single Story Over Hours
This graph shows the interest in a single breaking news story over hours.

But a single story can reap rewards much later than expected.

Sometimes, the work you do will pay off later. Just because it didn’t work the first time around doesn’t mean you can’t revisit the strategy and creatively rethink your approach.

Performance of a Single Story Over Time
This graph shows the gradually increasing interest in a single article feature months after it was published.

And your archive can be optimized too.

Your archive is one of your greatest assets. Every original piece of content on your site is like putting money in the bank that earns interest over time. It doesn’t take a genius to do the math and turn this insight into a long-term growth strategy. It just takes some lateral thinking.

Accrued Interest on Published Posts by Month of Publishing
This graph shows the gradually increasing, long-term interest in all content over months.

Untangling complicated websites can boost search traffic.

At the risk of boosting one domain while sacrificing another, consolidating domains can yield a net positive result. But changing the current structure of a website always entails the risk that some unknown ranking factor gets lost and topples the whole website from the top of the search results.

Untangling Complicated Websites
This graph shows the total contribution of lots of different subdomains and acquired domains to a single business. It’s messy data that reflects a disorganized site structure that needs to be fixed!

#SEOscience could cogitate for ages on the problem of reconciling multiple domains for a single business, but the discussion will be theoretical at best when, at the end of the day, someone has to do it to a deadline.

The risk has to be taken one way or another. So use your methodologies (and the Google Guidelines) to structure your work, so that you untangle carefully and catch mistakes before they happen. And again, it’s not science. If anything, such a methodical approach is part of the #creativeSEO process.
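As one hedged example of that methodical approach, the sketch below (my own illustration, with www.example.com standing in for the business’s main page) simply inventories which hostnames a page links to, so the tangle of subdomains and acquired domains is written down before anything gets redirected.

```typescript
// A pre-consolidation inventory sketch, not tooling from this post: pull one page
// and count which hostnames its links point at. "www.example.com" is a placeholder.
const startPage = "https://www.example.com/";

(async () => {
  const html = await (await fetch(startPage)).text();
  const hosts = new Map<string, number>();

  // Collect the hostname of every absolute link on the page.
  for (const match of html.matchAll(/href="(https?:\/\/[^"]+)"/gi)) {
    try {
      const host = new URL(match[1]).hostname;
      hosts.set(host, (hosts.get(host) ?? 0) + 1);
    } catch {
      // Ignore malformed URLs rather than failing the whole inventory.
    }
  }

  // Print the most-linked hosts first: these are the domains to untangle.
  for (const [host, count] of [...hosts.entries()].sort((a, b) => b[1] - a[1])) {
    console.log(`${host}: ${count} link(s)`);
  }
})();
```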

Consolidating Domains Yields a Net Benefit
This graph shows the impact of consolidating all those disorganized domains into a single unified business strategy, now comprising just two domains that no longer cannibalize each other. One domain suffered a loss, but the other gained hugely.

Migrating your blog to your core domain will probably boost search traffic.

Both your blog and your site can generate traffic independently, but simply moving your blog content from a subdomain to a folder on the core domain can yield a search boost to all of your content. If there is any single SEO strategy we could get scientific about, it’s probably this one. The effect of changing this one variable in the correct manner (i.e. with the right redirects and migration strategy) is one of the few things we could actually agree upon and easily share data about.
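Here is a minimal sketch of the kind of check I mean, assuming a hypothetical move from blog.example.com to www.example.com/blog/. The URLs are placeholders; the only claim is that every old address should answer with a single 301 pointing at its new home.

```typescript
// A minimal verification sketch under the assumptions above, not the migration
// tooling used for the graph below. Run with Node so fetch can inspect redirects.
const redirectMap: Record<string, string> = {
  "https://blog.example.com/my-first-post": "https://www.example.com/blog/my-first-post",
  "https://blog.example.com/category/seo": "https://www.example.com/blog/category/seo",
};

async function checkRedirect(oldUrl: string, expectedUrl: string): Promise<void> {
  // redirect: "manual" stops fetch from following the hop so we can inspect it.
  const response = await fetch(oldUrl, { redirect: "manual" });
  const location = response.headers.get("location");

  if (response.status === 301 && location === expectedUrl) {
    console.log(`OK   ${oldUrl} -> ${location}`);
  } else {
    console.log(`FAIL ${oldUrl}: status ${response.status}, location ${location}`);
  }
}

(async () => {
  for (const [oldUrl, expectedUrl] of Object.entries(redirectMap)) {
    await checkRedirect(oldUrl, expectedUrl);
  }
})();
```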

Impact of Migrating Blog to Core Domain
This graph shows the difference in organic search traffic to the same pages, before and after migrating a blog subdomain to the core domain.

Google says, “Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.”

Click-through rate is mostly a function of position.

If you have AdWords experience, you probably know this already. But it also holds true for SEO. Click-through rate (CTR) is a stronger indicator of how people use search engine interfaces than of how relevant your website is in the search results.

With so many keyword and phrase variations behind every intended search query, it’s almost absurd to attach the idea of absolute relevance to any single website. That doesn’t mean click-through rate is a useless metric, but it does mean it can only indicate so much — and what it will nearly always indicate is that being higher on the page is the most important place to be.

So, forget ‘ranking factors’ and just incorporate your understanding of that search user behavior into your SEO strategy.
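If you want to see that behavior in your own data, a rough aggregation like the sketch below will do. It assumes a hypothetical Webmaster Tools keyword export named queries.csv with query, impressions, clicks and average position columns; nothing else about the file comes from this post.

```typescript
import { readFileSync } from "fs";

// Bucket CTR by rounded first-page position to expose the "winner takes all"
// curve. Column order (query, impressions, clicks, avg_position) is an assumption.
const rows = readFileSync("queries.csv", "utf8").trim().split("\n").slice(1);

const buckets = new Map<number, { clicks: number; impressions: number }>();

for (const row of rows) {
  const [, impressions, clicks, avgPosition] = row.split(",");
  const position = Math.round(Number(avgPosition));
  if (position < 1 || position > 10) continue; // first page only

  const bucket = buckets.get(position) ?? { clicks: 0, impressions: 0 };
  bucket.clicks += Number(clicks);
  bucket.impressions += Number(impressions);
  buckets.set(position, bucket);
}

for (let position = 1; position <= 10; position++) {
  const bucket = buckets.get(position);
  if (!bucket || bucket.impressions === 0) continue;
  const ctr = ((100 * bucket.clicks) / bucket.impressions).toFixed(1);
  console.log(`position ${position}: CTR ${ctr}% over ${bucket.impressions} impressions`);
}
```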

CTR Plotted by First Page Organic Rankings (Webmaster Tools)
This graph plots the CTR per keyword ranking on the first page of organic search results (according to Webmaster Tools). The interesting thing to note here is how drastically CTR increases at the top-of-page positions, while the bulk of lower first-page positions perform roughly the same. It’s a “winner takes all” distribution.

Topic targeting may be as important as keyword rankings but no extra secret sauce is required.

The very latest secret-sauce advice for SEO in 2015 makes a big deal of the Knowledge Graph and over-eggs the claim that everything is about “entities” or “topics.” There would be nothing wrong with this advice except that most of the cases made for entity optimization are made under the auspices of improving rankings for specific terms.

Which is the wrong way of thinking about the problem that Google is trying to solve.

The entire entity-based data set, namely the Knowledge Graph, is designed to let Google cater to those classes of search queries that cannot be encapsulated in a single keyword expression of the idea. Ontological connections are deliberately disparate and vague so that Google can offer the user a best guess at what might be relevant. You could equally imagine every sighting of a Knowledge Graph result as Google shrugging and saying, “I’m not sure whether you want to know more or less on this topic,” and then deriving the intention of the query (say, whether it is an informational rather than a transactional query) from the ontologically related links the user clicks next.

Therefore, any theory on entity optimization that purports to go beyond schema/markup is basically table stakes for anyone who has a grounding in the basic SEO principle of targeting search traffic. It’s no big whoop.

Targeted Search Terms Vs Actual Landing Queries
This graph shows that at least double the number (if not 10x in some cases) of similar-but-different variations on the high-volume queries you are specifically targeting with SEO land on your website. This means you can almost always count on more exposure, not less! The click-through rates on these less competitive long-tail variations tend to be higher, even though individually they have far less volume. Yet taken in aggregate, the number of times they’re searched adds up to or exceeds the volume of the specific competitive keywords you are targeting. And they drive higher engagement!

Google says, “Make pages primarily for users, not for search engines.”

A lot of your best SEO decisions can come from just trying to get better analytics.

When you really design for the user, you have to design useful tools for yourself. Or put another way, you can’t design for the user without real user data. The graph below was created to tell the webmaster which content categories different audiences were most interested in.

Category Interest by Source (Via Onload Events in Google Analytics)
This graph shows which category of articles gets the most page loads, broken down by traffic source (via onload events in Google Analytics).

As you can see in this case, all audiences are consistently interested in SEO, which is good, as this is data from an SEO industry publication. Equally, it is fairly unsurprising that all the visitors from social networks were most interested in social media stories. Also, that visitors from Google Search were most interested in companies was not a huge surprise either, because the content in that category was essentially all news and optimized for company names.

However, there is one big surprise: audiences that arrived from LinkedIn were disproportionately interested in PPC, which highlights an opportunity to engage them in a more specific and strategic way.

With the data set up in this way, one could start to try some little experiments to change the pattern. But the primary skill required to get to this point is just learning how to set up your analytics correctly (sorry, no secret sauce required).

Bounce rate is adjustable (and pretty meaningless in aggregate).

The graph above just maps onload events in Google Analytics to specific content categories. And just this process of adding the onload events cut the average bounce rate to a fraction of what it was, simply because every page load now registers as a deliberate interaction.
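For anyone wondering what such an onload event looks like, here is a minimal sketch, assuming the classic Universal Analytics (analytics.js) snippet is already installed on the page. The data-category attribute on the body element is a hypothetical way of labelling content, not part of any standard setup.

```typescript
// A minimal sketch of the onload event described above, assuming analytics.js
// has already defined the global ga() function on the page.
declare function ga(...args: unknown[]): void;

window.addEventListener("load", () => {
  // Hypothetical convention: <body data-category="SEO"> labels the article.
  const contentCategory = document.body.dataset.category ?? "uncategorised";

  // Events sent this way count as interactions, so a single-page visit no longer
  // registers as a bounce - which is why the sitewide bounce rate collapses once
  // you start measuring page loads deliberately.
  ga("send", "event", "Content Category", "page load", contentCategory);
});
```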

Business owners and marketers continually misread bounce rates because the measurement itself assumes a lack of user interaction. To make this metric useful, you need to tweak your analytics setup and deliberately ascribe some kind of purpose to the pages and events on your site (the irony being that the events and actions that seem almost too trivial and obvious to measure have to become the ones that count most). Otherwise, you’re just measuring what you already know: that most users will bounce (and no one needs to prove that).

But for the sake of a golden rule, I’ll advise that you “stop watching bounce rates at an entire-site level.” In general, your “website bounce rate” really doesn’t matter; it only means something when you’re looking at a specific page. If you insist on worrying about bounce rates, then consider them only as part of an analysis of individual pages or segmented sets of pages on your website.

Bounce Rate by Page Type
This graph shows how segmenting bounce rate by page type yields a much more useful analysis: it reflects user behavior more strongly than a sitewide figure and aligns better with what you would expect a user to do. In this case, we could argue that site visitors really are finding what they want on the blog.

And bounce rates increase with intentionality of the page.

This just means that people are more likely to bounce from a page the less relevant it is to their current information needs. This is true regardless of where the page originally ranked in the search results. But that behavior alone points to how difficult it is for any SEO to claim to understand how relevance is really at play beyond the “luck of the draw.”

Bounce Rate by Page Type
This graph shows how segmenting bounce rate yet further by page type yields a more useful analysis, yet raises more questions than it answers. Are site visitors bouncing from the pricing pages because they are awful pages, or were they just not interested in buying in the first place? Perhaps the fact that they looked at the pricing pages at all is a win.

And bounce rates increase with intentionality of the medium.

So if visitors didn’t come from search engines, they probably shouldn’t be included in your SEO analysis — or your overall marketing analysis, for that matter. You have to segment your audience if you want the bounce rate metric in Google Analytics to tell you anything worthwhile at all.

Bounce Rate by Source to All Pages
This graph shows how segmenting bounce rate on all pages by source suggests that visitors from search engines and social networks display markedly different behavior.

So can we stop advising people to worry about bounce rates as a ranking factor? Bouncing users are inevitable, and whatever bouncing signifies to Google will be benchmarked on a completely different scale — as with the page layout update — one that takes the design of the entire internet into account (and which probably found that sites cluttered with ads cause more users to bounce at the search level).

So, clicking the back button is just a fact of life.

To drive the point home once and for all: bounce rate is just a measurement, not a signal. It can only identify behavior that already exists. It is a useful indicator, but not a useful predictor, of your ultimate SEO objectives. Sure, measured correctly, bounce rates can help you evaluate your business objectives and design better user experiences — which we know can ultimately yield a benefit on search engines. But to focus solely on bounce rates is to fixate on the finger when it’s clearly pointing at the moon.

Bounce Rate by Source to Pricing Pages
This graph shows how segmenting bounce rate yet further by source suggests that users from different sources display quite different behavior.

Buzz Lightyear says, “Power Laws, Power Laws Everywhere”

In my opinion, nearly everything we can fairly test in SEO will produce a power law distribution. Everything about Google Search, and how it frames the entire question of relevancy, has to lead to this kind of “winner takes all” result.
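For readers who want that idea pinned down, the toy sketch below fits a decay exponent to a set of made-up CTR-by-position numbers. The figures are placeholders, not data from this post; the only point is how quickly a “winner takes all” curve falls away.

```typescript
// A toy illustration of what "power law" means here: if CTR falls off roughly as
// position^(-alpha), then log(CTR) against log(position) is close to a straight
// line. The CTR values below are made-up placeholders.
const ctrByPosition = [0.30, 0.14, 0.09, 0.06, 0.05, 0.04, 0.03, 0.03, 0.02, 0.02];

// Simple least-squares slope on the log-log points gives the decay exponent.
const points = ctrByPosition.map((ctr, i) => [Math.log(i + 1), Math.log(ctr)]);
const n = points.length;
const meanX = points.reduce((sum, [x]) => sum + x, 0) / n;
const meanY = points.reduce((sum, [, y]) => sum + y, 0) / n;
const slope =
  points.reduce((sum, [x, y]) => sum + (x - meanX) * (y - meanY), 0) /
  points.reduce((sum, [x]) => sum + (x - meanX) ** 2, 0);

console.log(`fitted exponent alpha = ${(-slope).toFixed(2)} (winner-takes-all decay)`);
```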

Buzz Lightyear on Power Laws
This is Buzz Lightyear talking about statistical patterns. Thanks for reading this far – you RULE!

From our own experience, we all know whether something is relevant or not. But what our own experience can never tell us is just how fragile the concept of relevance is from moment to moment at “Google Scale.” We can’t even count how many times the question of what’s relevant comes up for ourselves in day-to-day life, let alone in the eyes of billions of users.

As we are truly unable to infer the inner workings behind an information seeker’s input (the keyword queries), we can only make assumptions about it based on their output (the ten blue links they click). Which means, in general, we’re only testing the design of a search interface and have gotten no closer to defining the quality we tout as “relevance.”

Ultimately, for Google, once you move beyond the backstory of PageRank, there really is no “algorithm” without clicks. It’s just a giant sample testing machine. It’s a weather vane that points to what might be relevant, yet once the results are displayed it’s down to the user to choose. Search engines and social networks might yearn for the day when the “ten blue links” design we see in search results pages are a thing of the past, but that day will only come if the new user interface proves to be the one most universally adopted by users.

Search engines are currently designed in such a way that there can be no clicks without lists. And anything displayed as a list implies an order or set of priorities — even in an unordered list. That habit of inferring an implied order is automatic, a product of the simple fact of how we read. We read lists from top to bottom, which incites certain behaviors that have nothing to do with relevance.

That such behavior massively influences any concept of relevance should not be perpetually discounted!

In fact, I would argue it is just about the only thing the SEO industry can sensibly test without actually being Google.

Written By
Jonathan Allen is the President of Longneck & Thunderfoot, a brand publishing company which is part of the Columbia Startup Lab, an incubator program based in New York City. He is also formerly the Director of Search Engine Watch (2009-2013), one of the longest running news publications on the search...