Microformat and Datafeed Optimisation – A4U Expo London 2010
Panel – Richard Baxter of SEOGadget, Jonathan Stewart of ReviewCentre and Will Critchlow of Distilled.
Richard steps into Jon Myer's shoes to help moderate as well as speak, so he starts with the intro:
Certain information types, such as address formats and time and date formats, are instantly recognisable to humans, but not to machines. Microformats therefore mark up HTML in such a way that these data types become machine-recognisable, so they can be understood and presented in more useful ways.
Rich shows us an example of HTML marked up with hCard, which is used for understanding address and contact data.
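For illustration, a minimal hCard fragment might look something like this (the name and address are invented, not from the talk; the class names are the standard hCard properties):

```html
<!-- hCard: "vcard" wraps the card; "fn" is the formatted name,
     "adr" holds the address sub-properties -->
<div class="vcard">
  <span class="fn">Jane Example</span>
  <div class="adr">
    <span class="street-address">1 Example Street</span>,
    <span class="locality">London</span>,
    <span class="postal-code">SW1A 1AA</span>,
    <span class="country-name">UK</span>
  </div>
</div>
```

To a human this renders as an ordinary address; to a parser, each piece of the address is now unambiguously labelled.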
hReview is used for marking up standard data conventions in review content, such as "star" scoring, plus the summary description and the number of reviews for a product, event or location.
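A sketch of a single hReview, with invented values for illustration (the class names follow the hReview property names):

```html
<!-- hReview: "item" identifies what is being reviewed, "rating" the
     score, "reviewer" is a nested hCard, "dtreviewed" the review date -->
<div class="hreview">
  <span class="item"><span class="fn">Example Widget</span></span>
  <span class="summary">Does exactly what it says</span>
  Rating: <span class="rating">4</span> out of 5.
  Reviewed by <span class="reviewer vcard"><span class="fn">A. Reviewer</span></span>
  on <abbr class="dtreviewed" title="2010-10-15">15 October 2010</abbr>.
  <div class="description">A short write-up of the review itself.</div>
</div>
```

For the "number of reviews" case there is also an aggregate variant (`hreview-aggregate` with `count` and an average `rating`).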
hCalendar – used to mark up the properties of date and time information.
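A minimal hCalendar event might be marked up like this (the event details are invented for illustration; the machine-readable dates sit in the `title` attributes):

```html
<!-- hCalendar: "vevent" wraps the event; "dtstart"/"dtend" carry
     ISO 8601 dates in title attributes, human-readable text inside -->
<div class="vevent">
  <span class="summary">A4U Expo London</span>:
  <abbr class="dtstart" title="2010-10-19">19 October</abbr> to
  <abbr class="dtend" title="2010-10-21">21 October 2010</abbr>,
  at <span class="location">London, UK</span>.
</div>
```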
hCard (which can also be combined with hResume), à la LinkedIn, marks up the information that forms the professional synopsis of an individual, e.g. name, location and connections.
Ensure you use Google's Webmaster support information on implementing microformats; there is also a rather helpful validation/preview tool.
Also – Richard has put a lot of effort into building up a great deal of content all about structured data formats on the SEOGadget blog which I’d recommend checking out.
Next up is Jonathan Stewart of ReviewCentre. Review pages account for 75% of their traffic, and they rolled out hReview across different types of reviews in order to take a staged approach, starting with the "Item" type.
They submitted in October, heard back in November (there was a problem with the markup), corrected and resubmitted in December, and the snippets finally showed up on January 30th, after six weeks of little to no contact with Google.
The traffic effect: a 14% uplift.
Something to bear in mind if you have a large number of indented listings: rolling out snippets to more pages may not produce the outsized uplift one might assume from small data-set tests, due to the secondary effect on second-indent listings.
Key learnings – don't expect much help from Google; do expect a significant traffic increase; and do expect a shift in your traffic patterns.
Final speaker of the session is Will (not Tom) Critchlow, who will be speaking about the difficulties with affiliate sites and data feeds.
Will starts by observing that Google seems not to favour affiliate sites, which appears linked to the uniqueness of their content. That, of course, is a problem if the bulk of your content is provided by a merchant data feed.
Will shows us a graph for a client who came to Distilled after their traffic had essentially bombed. Analysis suggested the lack of any unique content was the root problem. When some simple, brief but unique content was added as a test, the test pages began to recover, confirming that the lack of unique content was indeed the issue.
So, how to generate more unique content?
Will talks about using Mechanical Turk and similar services, though he is careful to point out that this isn't recommended for user reviews (there are legal issues with paying people to write reviews); comments and other UGC on your own site, however, are fair game.
In addition, Will shows us a rather cool cheat sheet of different APIs, sources and otherwise freely available content to mix up. Other examples include external search queries, internal search queries, tags, testimonials, FAQs, support emails, etc.
Be careful with external search queries, as publishing them can cause difficulties on those occasions when a page randomly ranks for a crazy, not-really-relevant multi-word query.
Reviews – set a custom variable when there is no review for an item, in order to differentiate those pages, or use the ReviewCentre approach of encouraging reviews: "we have no reviews… be the first", or words to that effect.
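As a sketch of the custom-variable idea, using the classic ga.js asynchronous syntax of the era (the slot number and the `HasReviews` name/value are my own invention for illustration, not from the talk):

```html
<script type="text/javascript">
  // Flag a page that has no reviews yet with a page-level
  // custom variable, so it can be segmented in reports.
  var _gaq = _gaq || [];
  _gaq.push(['_setCustomVar',
    1,             // slot (1-5)
    'HasReviews',  // hypothetical variable name
    'no',          // value on a review-less page
    3              // scope 3 = page-level
  ]);
  _gaq.push(['_trackPageview']); // set the variable before the pageview
</script>
```

The variable must be pushed before `_trackPageview` so it is attached to that pageview.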
Context is key! Will says that when it comes to lyrics, the duplicate content filters are essentially off. In that case the content isn't unique as such, but it is unique versus competitors.
Clean up your data! Don't just take the spoon-fed data from the merchant; clean it and align it more carefully with what users are actually searching for.
Of course, you can go the Mahalo route and just get bucketloads of links. It may work, has worked, and can work at times, but is this really a recommended, scalable business model?