This was a great session in which Patrick and Jon shared some very actionable insights into site and link audits and how to recover from search penalties.
Patrick Altoft, Branded3
Patrick started by discussing the future of SEO. At its core is the question "do people love you?" If people love your site, link building becomes far easier than if they don't. Google gathers this kind of data from Chrome; while it isn't used directly for rankings, it does feed their AI algorithms around site quality. John Lewis is a good example of a brand that people love.
He then went on to talk about Home and Garden Gifts, which is a pretty good site from an onsite perspective. It has 114 linking domains. SearchMetrics, however, shows two major dips for the client: one coincided with Panda 2.2, and the second big drop was due to link devaluations.
Their link profile looked OK at first glance, with a lot of brand links and not much use of keyword anchors. What they had done, though, was a lot of submissions to directories that existed only for SEO purposes, which Patrick saw as the cause of the dips.
The next step for them is a content check. If you don't have a great deal of content, your long-tail rankings are likely to suffer. Most of their product pages held only 30-odd words, which isn't nearly enough; you need a few hundred words at least to avoid Panda. Duplicate content is also an issue, and a quick exact-match search in Google will show it up. If your copy isn't original, you need to get original copy written if you want Google traffic.
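As a rough illustration (not from the talk), a thin-content check along these lines can be sketched in Python. The page dictionary, the 200-word threshold, and the helper names are my own assumptions, not anything Patrick specified:

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text from HTML, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def word_count(html):
    """Count visible words in an HTML document."""
    parser = TextExtractor()
    parser.feed(html)
    return len(re.findall(r"\b\w+\b", " ".join(parser.parts)))

def flag_thin_pages(pages, threshold=200):
    """Return URLs whose body copy falls below the word threshold.
    `pages` maps URL -> raw HTML (assumed already fetched by a crawler)."""
    return [url for url, html in pages.items() if word_count(html) < threshold]
```

Feeding a crawl export through `flag_thin_pages` gives a quick shortlist of pages that need more copy before they are Panda-safe.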
If your product pages don't get much traffic, adding reviews via a platform such as Bazaarvoice has been shown to increase search traffic. Questions and answers should also be integrated into product pages; this can have a great effect on onsite content and conversion rate.
Moving forward, the site appears to have been penalised twice, and Patrick closed by outlining remedial actions.
Jon Quinton, SEOgadget
Using the IIS crawl report is like using Screaming Frog on speed: it can highlight huge numbers of unnecessary redirects.
Architecture problems are often the cause of major SEO headaches. Exporting the Xenu crawl by level shows the ratio of deeper content. In his example he showed that mapping the site out by continent flattens the architecture; he would, however, keep the popular destinations on the home page.
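To make the "ratio of deeper content" idea concrete, here is a minimal sketch (my own, not Jon's) of how a crawl-by-level export could be summarised; the `(url, depth)` pair format and the three-click cutoff are assumptions:

```python
from collections import Counter

def depth_distribution(pages):
    """pages: iterable of (url, depth) pairs, e.g. parsed from a Xenu
    crawl export. Returns {depth: share of pages at that depth}."""
    counts = Counter(depth for _, depth in pages)
    total = sum(counts.values())
    return {d: counts[d] / total for d in sorted(counts)}

def deep_content_ratio(pages, cutoff=3):
    """Share of pages more than `cutoff` clicks from the home page,
    i.e. the content a flatter architecture should surface."""
    counts = Counter(depth for _, depth in pages)
    total = sum(counts.values())
    return sum(n for d, n in counts.items() if d > cutoff) / total
```

A high `deep_content_ratio` is the signal to restructure, e.g. grouping destinations by continent so fewer pages sit more than a few clicks deep.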
Finding bad links is essential in a site audit. Jon uses all the major link tools and combines their exports into a single spreadsheet, filtering out duplicates. The next step is to check which links are no longer live. He then separates the domain from the link URL and uses SEOTools for Excel to pull in the PageRank for each domain. While PageRank isn't the best metric, it's good enough for a quick glance. Once done, you can map the quality of links into a PageRank distribution chart, and you can also break the anchor text distribution down into brand vs. non-brand.
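Jon does this in Excel, but the same merge/dedupe/classify steps can be sketched in Python. This is an illustrative outline under my own assumptions: rows are `(source_url, anchor_text)` tuples, `root_domain` is deliberately naive, and the brand-term list is invented:

```python
from urllib.parse import urlparse
from collections import Counter

def combine_link_exports(*exports):
    """Merge (source_url, anchor_text) rows from several link tools,
    dropping exact duplicates while preserving order."""
    seen = set()
    merged = []
    for export in exports:
        for row in export:
            if row not in seen:
                seen.add(row)
                merged.append(row)
    return merged

def root_domain(url):
    """Naive domain extraction; a real audit would also handle
    multi-part TLDs such as .co.uk."""
    return urlparse(url).netloc.lower().removeprefix("www.")

def anchor_split(rows, brand_terms):
    """Classify each anchor text as brand vs. non-brand."""
    counts = Counter()
    for _, anchor in rows:
        is_brand = any(term in anchor.lower() for term in brand_terms)
        counts["brand" if is_brand else "non-brand"] += 1
    return counts
```

From here, grouping the merged rows by `root_domain` and joining on a quality metric gives the distribution chart Jon describes; a dead-link check would simply HTTP-request each source URL.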
Building good links is obviously very important. The maps for Tripomatic are a main source of great content, so ensuring people can embed those maps is key. He then suggests scraping a list of opportunities from guest-blog searches into a Google doc. He also suggests blogging within a subfolder, which can then attract more links for the site. Features and interviews can deliver a good number of links, especially from self-promoting small businesses.
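The guest-blog search step above usually starts from a list of search-footprint queries. A minimal sketch, with footprint strings that are my own assumptions rather than Jon's actual list:

```python
def guest_post_queries(keywords, footprints=None):
    """Combine topic keywords with common guest-post search footprints
    to produce queries you can run and scrape into a Google doc."""
    footprints = footprints or [
        '"write for us"',
        '"guest post"',
        'intitle:"guest post guidelines"',
    ]
    return [f"{kw} {fp}" for kw in keywords for fp in footprints]
```

Running each query and recording the domains found gives the opportunity list to qualify by hand.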