
Saving the Environment with Schema.org

27 November 2013 by Barry Adams

We’re all very aware of the advantages of implementing schema.org structured data on our websites. The extra semantic markup unlocks a host of special features in Google’s search results, primarily in the form of rich snippets for various types of content.

From star-review ratings to product information, Google is increasingly using rich snippets to provide additional details about the webpage it is ranking, and that in turn yields stronger click-through rates from SERPs. So schema.org is a great tactic to increase traffic from Google search without having to improve your site’s rankings.
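For instance – and this is purely an illustrative sketch, with an invented product name, rating and price – the microdata behind a product rich snippet looks something like this:

```html
<!-- Illustrative only: minimal schema.org Product markup in microdata.
     The product name, rating and price are invented. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.2</span> stars from
    <span itemprop="reviewCount">87</span> reviews
  </div>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="19.99">£19.99</span>
    <meta itemprop="priceCurrency" content="GBP" />
  </div>
</div>
```

Google’s crawler reads the itemprop attributes directly, which is what makes the star rating and price eligible to show up in the snippet.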

But what is Google’s incentive for introducing rich snippets for structured data? Why is the search engine so graciously rewarding websites that go that extra mile and implement schema.org?

The obvious answer that Google provides is that these rich snippets improve user experience on its SERPs. It’s the default explanation for any change that Google implements, and while it may have some merit in this instance it hardly makes for a convincing motivation. One of the unintended but easily predicted side-effects of rich snippets for reviews, for example, has been an abundance of 4/5-star rich snippets in Google’s search results, effectively nullifying any advantage these sites may have gained from the review ratings:

Eenie, meenie, miny, moe

Looking a bit deeper, it seems clear that schema.org markup makes things a bit easier for Google. When Google crawls the web and indexes the content it finds, it needs to do a lot of work. Every page its crawlers retrieve needs to be analysed, code separated from content, and the content parsed and evaluated for relevance. Google does this 24 hours a day, 7 days a week, all throughout the year, at a pace of millions of webpages every minute.

This crawling & indexing of websites is a very CPU-intensive process. Anyone who’s ever run a Screaming Frog crawl on a large website will have seen their computer slow down as the crawler works its way through the site. This is not because Screaming Frog is badly coded – quite the opposite – but because crawling and indexing websites is hard work and consumes a lot of your computer’s resources.

Scale this up by quite a few orders of magnitude, and you can see where this is heading.

With structured data you’re effectively signposting your content for Google’s indexing, enabling the search engine to identify and parse your website’s content with far less effort. Schema.org markup takes away a lot of the CPU-intensive work Google has to do when analysing a website’s source code. That makes the whole indexing side of its search engine more efficient, saving Google a few CPU cycles for every webpage that has implemented structured data.
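To make that signposting concrete, here’s a small made-up contrast (the event details are invented purely for illustration):

```html
<!-- Unmarked content: the crawler has to infer that "27 November 2013"
     is a date and that "Belfast" is a location. -->
<p>SEO Meetup, 27 November 2013, Belfast</p>

<!-- The same content signposted with schema.org microdata: every value
     is explicitly labelled, so no guesswork is required. -->
<p itemscope itemtype="http://schema.org/Event">
  <span itemprop="name">SEO Meetup</span>,
  <time itemprop="startDate" datetime="2013-11-27">27 November 2013</time>,
  <span itemprop="location">Belfast</span>
</p>
```

In the second version the parser simply reads off the labelled values instead of running heuristics over free text.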

A saving of a few cycles per webpage doesn’t sound like much, but at the scale Google operates it can amount to a lot of kilojoules. Multiply a fraction of a kilojoule by a few dozen billion webpages and you end up with a considerable energy saving over the long run.
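To put a purely illustrative number on it – these figures are invented for the sake of the arithmetic – suppose parsing a signposted page saves just 0.1 kJ. Across 30 billion pages that’s 3 billion kJ, or 3×10¹² joules, which works out at roughly 830 megawatt-hours per full crawl of those pages.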

This is also one of the motivations for Google to push for faster websites, and why load speed has so often been hinted at as a ranking factor. Google’s crawlers struggle with slow-loading websites, so the search engine wants to make things just that little bit more energy efficient with faster crawling & more efficient indexing of websites.

One of Google’s data centres

These attempts at marginal energy savings, which on Google’s scale add up to significant reductions in power consumption, are not made for entirely altruistic purposes. Yes, it’ll mean Google’s data centres will have lower carbon emissions, and that’s great for the environment and all, but it also saves Google quite a bundle in energy costs. Maximising shareholder value, one CPU cycle at a time.

Does all of the above sound a bit far-fetched? Do structured data and fast-loading websites really result in energy savings in Google’s data centres? The last energy usage figures released by Google, in 2011, show the company has a rather significant power bill, and since we know the company’s engineers often exhibit an almost pathological attention to detail, I wouldn’t put it beyond them to leverage the SEO community this way.

AUTHORED BY:

Barry Adams is one of the editors of State of Digital and is a freelance SEO consultant based in Belfast, delivering specialised SEO services to clients across Europe.
  • Alan Charnock

    All these rants point to one thing Barry, you are a double agent for Google. (or we have not had an update for a week or so.) Either way good work.

  • Nick Wilsdon

    Like the theory Barry :-) This might also explain the push for responsive web design as their preferred option. Once they know a site is responsive, they throw a switch and stop sending all the alternative bots to crawl the domain. I imagine that saves them a fair number of CPU cycles.

  • Matt O’Toole

    I wouldn’t be surprised if you’re right, Barry. When organisations get this big, any tiny process improvements can yield significant bottom line savings, possibly enough to justify someone’s yearly salary for a few days’ work.
