
SES London Day 3 – Assess. Diagnose. Fix: How to Become a Leading SEO Mechanic #SESLon

18th February 2014

More coverage of day three of SES London 2014. Today I’ll be covering “Assess. Diagnose. Fix: How to Become a Leading SEO Mechanic”, presented by Andre Alpar, Partner at AKM3 GmbH, and David Naylor, chief SEO at Bronco.

Whether an issue arises from indexing, crawling inefficiencies, design issues, or over-optimisation, you need to possess a diagnostic mindset to fix your SEO problems.

This session is dedicated to helping you learn the following:

  • Which areas of analytics are powerful eye-openers for portraying SEO issues.
  • Which tools work best for diagnosing SEO issues quickly.
  • What causes common SEO issues and how to remedy them.
  • How to create effective reporting to help monitor performance.

Andre started the presentation by talking about the importance of crawling and indexing management. These two elements are the foundation for achieving good rankings and effectiveness in SEO.

Crawling management

If a site owner or webmaster does not focus the crawlers, they will crawl everything they find. When laying out a crawling strategy, Andre prefers to think of a website as an onion with layers: blocking the less important URLs from being crawled increases the probability that the important ones will be.

Examples of pages or files that you do not want to be crawled:

  • printable versions of pages
  • PDFs
  • small versions of images
  • URLs with parameters used for sorting or filters

Use the robots.txt file and 301 redirects to steer the spiders (a minimal robots.txt sketch follows the tool lists below). Browser add-ons that can help determine whether robots.txt is configured correctly, and show which pages a web crawler may crawl, include:

  • Roboxt
  • Linkparser

Alternatively, use crawlers that help webmasters and marketers analyse a website, such as:

  • Strucr.com
  • Onpage.org
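
To illustrate the ‘onion’ approach, here is a minimal robots.txt sketch along the lines Andre describes. The paths are hypothetical examples of low-value URL patterns and would need to match your own site structure; note that the * and $ wildcards are understood by Googlebot but not by every crawler.

    User-agent: *
    # Printable versions of pages
    Disallow: /print/
    # PDF files
    Disallow: /*.pdf$
    # Small versions of images
    Disallow: /images/thumbs/
    # URLs with sorting or filter parameters
    Disallow: /*?sort=
    Disallow: /*?filter=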

Indexing management

Indexing strategy focuses on which URLs are important for users alone and which matter for SEO as well. Ask yourself over how many URLs you want to ‘spread’ (distribute) the authority your domain has. (Andre used a funny analogy for authority distribution: a bottle of vodka shared with friends. If you share the bottle with too many friends, each gets a tiny bit and doesn’t even feel it; but if you share it with your best friends only, they will all feel it.)

How to find the right indexing strategy

The first step in finding the right indexing strategy is sorting through the pages that are visible to your users. Some URLs are important for users and internal linking only (e.g. cart pages, the help section, pagination pages, or pages with insufficient content quality), while others matter for SEO as well. But many websites also need additional SEO landing pages for ‘translation’ purposes, as Andre puts it: these pages are a translation layer to accommodate how real people search.

Use the robots meta tag (<meta name="robots" content="…">) for indexing management and define the following rules:

  • Pages for users + internal linking = noindex, follow
  • Pages for users + internal linking + SEO = index, follow

The last type of page is where you need to focus all your SEO efforts: populate it with valuable content and promote it in search results.
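
As a minimal sketch of these two rules, the robots meta tag goes in the <head> of each page. The page types in the comments are the hypothetical examples from above:

    <!-- Cart, help or pagination page: keep it out of the index, but let crawlers follow its links -->
    <meta name="robots" content="noindex, follow">

    <!-- SEO landing page: index it and follow its links (this is also the default behaviour) -->
    <meta name="robots" content="index, follow">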

David continued the session by asking where one really starts in order to become a leading SEO mechanic. It has to be Google Webmaster Tools (GWT).

Watch out for warning messages, common examples include:

  • Crawl errors
  • Manual actions
  • Server problems

Keyword data may be inaccurate. GWT gives the keywords and positions but doesn’t make clear that results are localised, or allow users to see where the localisation applies. David shared an example from Bronco’s GWT where the agency is #1 for ‘marketing agency’, but in truth only in Ripon, where the agency is based.

Backlinks

The total number of links displayed in GWT does not match the number that can be downloaded as a .csv. This is because the data is sampled, so keep downloading the latest links over a few days or weeks, pull the latest links from third-party tools such as Majestic and Ahrefs, and perform a clean-up or detox campaign based on all the data you can get. Make use of detox tools to help you determine which links are bad.
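
As a rough sketch of that merging step, the snippet below combines several CSV exports into one deduplicated list of linking URLs. The file names, and the assumption that the linking URL sits in the first column, are hypothetical; real GWT, Majestic and Ahrefs exports use their own column layouts.

    import csv

    def load_urls(path):
        """Read a CSV export and return the URLs found in its first column."""
        urls = set()
        with open(path, newline="", encoding="utf-8") as f:
            reader = csv.reader(f)
            next(reader, None)  # skip the header row
            for row in reader:
                if row and row[0].strip():
                    urls.add(row[0].strip().lower())
        return urls

    # Hypothetical export file names -- replace with your own downloads
    exports = ["gwt_latest_links.csv", "majestic_backlinks.csv", "ahrefs_backlinks.csv"]

    all_links = set()
    for path in exports:
        all_links |= load_urls(path)

    with open("combined_backlinks.txt", "w", encoding="utf-8") as out:
        out.write("\n".join(sorted(all_links)))

    print(len(all_links), "unique linking URLs collected")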

Once toxic links are discovered, put them straight into the disavow tool.

Businesses should now choose a pre-emptive strategy and deal with toxic links that may have been obtained in the past. “It is a new world out there and bad links are just bad karma. Just get rid of them as soon as you can,” says David. He feels that Google’s trend of going after bad links will only intensify.

Google’s message about backlinks is very clear – they are not going to accept low quality spam any more. Review your profile and add any sites you are not 100% happy with into your disavow.
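
The disavow file itself is a plain text file uploaded through the disavow tool: one URL per line, a domain: prefix to disavow a whole domain, and # for comments. The entries below are hypothetical examples only:

    # Links identified as toxic from GWT, Majestic and Ahrefs data
    # Disavow every link from a domain
    domain:spammy-directory.example
    # Disavow a single URL
    http://low-quality-blog.example/seo-guest-post.html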

Site speed

Googlebot likes fast websites across both desktop and mobile devices. There are a number of tools out there which help analyse the loading behaviour of your website:

  • GTmetrix – generates scores for web pages and offers actionable recommendations on how to fix them.
  • Botify – a good crawler to extract metadata from a site and provide actionable recommendations on how to optimise your site.

Content

Copyscape is a great free tool to help you monitor your content regularly for plagiarism.

You can also use Google search to find copied content by searching for exact snippets taken from your own pages.

Keyword tracking

In David’s opinion, rankings are still worth monitoring every day.

Tools:

  • Pro RankTracker
  • AWRCloud

Key takeaways

  • Manage how spiders crawl your website and help them find the most important pages.
  • Monitor backlinks and deal with anything toxic as soon as you can.
  • Look after your content and deal with any plagiarism issues.
  • And make GWT and other tools mentioned in this session your best friends.

Written By
Polly Pospelova is a passionate online marketing professional who thrives on delivering unique value-driven search solutions. As well as managing UNRVLD agency’s natural and paid search teams, Polly works closely with both technical developers and UX specialists to maximise customer experience, customer engagement and conversions.