BrightonSEO: Lessons Learnt from Technical SEO: Site Performance and Crawlability

29th April 2014

State of Digital has summarised a series of technical SEO lessons gleaned from a selection of BrightonSEO speakers. From sitemaps, bots and crawlability to SEO audits and site migration, we cover the presentations of Dawn Anderson, Bastian Grimm, Pete Handley and Kate Dreyer.

Lesson 1: Good sitemaps = friend. Bad sitemaps = SEO death

“SEO death” may sound dramatic, but those are the words Dawn Anderson reached for when describing what excessive numbers of indexed URLs can do. Dawn shared her experience of managing and optimising a tradespeople website, which originally started as a hobby but became her full-time job. As part of ongoing optimisation, Dawn wanted to add an additional dimension to “explode” organic search for longer-tail keywords: sitemaps. So, Dawn’s team keenly set about creating and submitting 60 sitemaps with 50,000 URLs each – the maximum a single sitemap may contain.

The number of indexed pages quickly rose from 400,000 to 1.5 million, but organic rankings started slipping, and ultimately only 0.1% of indexed pages were being crawled each day. As organic traffic continued to plummet, Dawn’s team uncovered an infinite loop in the sitemaps that was squandering crawl budget. With crawl budget so limited, the organic rank of the associated pages began to suffer.

So how do you fix this? Dawn’s advice was to always test sitemaps to identify issues such as infinite loops, and to find out where Googlebot actually goes by checking your server logs alongside the crawl stats in Google Webmaster Tools. Another tip was to resubmit sitemaps regularly, as this helps reinforce the website’s structure to Google. Naming and categorising sitemaps – e.g. Shoes.sitemap.xml, Tshirts.sitemap.xml, etc. – gives you both a granular and a bird’s-eye view, helping you identify issues across a site. Dawn recalled the SEO website-architecture mantra of “flat and fat”: valuable content should be accessible and not hidden deep within the site, as burying it dilutes equity.
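
Dawn didn’t share her tooling, but a loop like the one she described can be caught before Googlebot finds it. Below is a minimal sketch in Python, assuming a sitemap index at /sitemap.xml, the standard sitemaps.org namespace, and example.com as a placeholder domain:

    # Walk a sitemap index and flag any sitemap referenced twice,
    # which is how an infinite loop shows up.
    import urllib.request
    import xml.etree.ElementTree as ET

    NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

    def fetch_locs(url):
        # Return every <loc> value in a sitemap or sitemap index.
        with urllib.request.urlopen(url) as resp:
            tree = ET.parse(resp)
        return [el.text.strip() for el in tree.iter(NS + "loc") if el.text]

    def walk(url, seen):
        if url in seen:
            print("LOOP detected at:", url)
            return
        seen.add(url)
        for loc in fetch_locs(url):
            if loc.endswith(".xml"):   # crude test for a child sitemap
                walk(loc, seen)

    walk("https://www.example.com/sitemap.xml", set())

The same walk also gives you a full URL inventory, which you can sanity-check against the number of pages you actually expect to exist.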

As crawl budget is related to PageRank, Dawn finished by tabling a theory from A J Kohn of Blind Five Year Old: pages that are crawled more often seem to rank better. So, if you can effectively herd Googlebot with a well-defined robots.txt, nofollow, sitemaps, navigation paths, cross-module internal linking and so on, could you ultimately increase the PageRank of those pages?
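
One practical way to see how well you are herding Googlebot is to count its hits per site section in your raw server logs. A minimal sketch, assuming an Apache/nginx combined log format in a file called access.log (both assumptions to swap for your own setup):

    # Count Googlebot requests per top-level site section, to see
    # which areas of the site are eating crawl budget.
    import re
    from collections import Counter

    LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+"')

    hits = Counter()
    with open("access.log") as log:
        for line in log:
            if "Googlebot" not in line:   # NB: spoofable; verify via
                continue                  # reverse DNS for real audits
            m = LINE.search(line)
            if m:
                section = "/" + m.group("path").lstrip("/").split("/", 1)[0]
                hits[section] += 1

    for section, count in hits.most_common(10):
        print(f"{count:6d}  {section}")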

Lesson 2: Bots and users – don’t waste their time!

Slow…

loading…

pages…

Frustrating for humans and boring for bots. Bastian Grimm gave a wealth of tips, methods and tools to increase site speed, as slow page loads negatively affect a website’s organic rank. Bastian quoted an Amazon case study which found that every 100ms of added latency in page load cost 1% of revenue. Profitability may be a more convincing business case for investing in site speed than user experience, so Bastian set about outlining how to achieve it.

The first step: how fast is your site? Below is a list of some of the tools Bastian referenced, although you can find them all in the full presentation here; a quick DIY timing sketch follows the list.

Tools to measure site speed:

  • SiteSpeed
  • GTmetrix.com
  • YSlow – Bastian did stress this is a browser plugin and so depends on your connection speed, making it less accurate than web-based tools
  • Google Analytics average page-load time – however, this is based on sampled data, so take it with a pinch of salt
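
If you want a very rough first number before reaching for those tools, you can time a fetch yourself. A minimal sketch using only Python’s standard library (example.com is a placeholder); it won’t match a real browser, since there is no rendering or asset loading, so treat it as a floor:

    # Time to first byte and full download for a single URL.
    import time
    import urllib.request

    def time_fetch(url):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            ttfb = time.perf_counter() - start   # headers received
            resp.read()                          # drain the body
        total = time.perf_counter() - start
        print(f"{url}: TTFB {ttfb*1000:.0f} ms, total {total*1000:.0f} ms")

    time_fetch("https://www.example.com/")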

Now you know your site speed, how do you make your site fast? Bastian said the key lies in streamlining your site’s code, as reducing its length will improve load times. Removing unnecessary whitespace, line breaks and comments all helps; similarly, compressing images to reduce file size can make a big difference.
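
To make the whitespace point concrete, here is a deliberately naive minifier sketch in Python, assuming plain CSS with only /* ... */ comments. Real minifiers are far more careful; this just shows where the saved bytes come from:

    import re

    def naive_minify_css(css):
        css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # strip comments
        css = re.sub(r"\s+", " ", css)                   # collapse whitespace
        css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # tighten punctuation
        return css.strip()

    before = "/* header */\nh1 {\n    color : #333333 ;\n    margin : 0 ;\n}\n"
    after = naive_minify_css(before)
    print(f"{len(before)} chars -> {len(after)} chars")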

Methods and tools to minify and compress:

  • JPEGmini and TinyPNG to reduce image size
  • HeadJS – speeds up the loading of JS and CSS
  • SpriteMe – takes all your small images and gives you the CSS code to serve them as one big sprite image instead
  • CSS minifiers and compressors
  • GZIP – server-side compression for text assets; a quick check that it’s enabled follows this list
  • Kraken.io – Free online image optimizer
  • Robotto – a tool for monitoring website changes, e.g. when a site goes down or robots.txt changes
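
Whether GZIP is actually switched on is easy to verify. A minimal sketch using only Python’s standard library (example.com is a placeholder): send an Accept-Encoding header and inspect what comes back:

    import urllib.request

    def check_gzip(url):
        req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
        with urllib.request.urlopen(req) as resp:
            encoding = resp.headers.get("Content-Encoding", "none")
        print(f"{url}: Content-Encoding = {encoding}")

    check_gzip("https://www.example.com/")

If this prints “none” for your HTML, CSS or JS, compression is almost certainly not enabled on the server.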

Lesson 3: SEO audit checklist

Pete Handley gave a summary of the six steps of an SEO audit. You can see Pete’s presentation here, but the steps of the audit are elaborated upon below:

1) Site crawling:

  • Tools to crawl sites: ScreamingFrog SEO Spider, Rob Hammond’s free crawler, Beam Us Up and Linkdex.

  • Robots.txt: Check the robots.txt to understand where Googlebot is allowed to go. Is there a stray “disallow all” rule, or are there IP restrictions? These simple things can easily sneak into robots.txt, so check and check again, and make everything as crawlable as you can (bearing in mind what Dawn said earlier about crawl budget!). A quick programmatic check is sketched after this list.
  • site:www.example.com: Another method is to use Google’s site: search operator to discover what content is being indexed, spot errors, and find orphaned pages or themes of content that you don’t want indexed.
  • Google Webmaster Tools and Bing Webmaster Tools: Look for crawl errors in both – and if you’re not using Bing Webmaster Tools, you really should be.
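
As promised, a minimal robots.txt sanity check using Python’s standard-library urllib.robotparser, with example.com and the URL list as placeholders for your own key pages:

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("https://www.example.com/robots.txt")
    rp.read()

    # URLs you definitely want crawlable -- swap in your own
    must_be_crawlable = [
        "https://www.example.com/",
        "https://www.example.com/products/",
        "https://www.example.com/blog/",
    ]

    for url in must_be_crawlable:
        if not rp.can_fetch("Googlebot", url):
            print("BLOCKED for Googlebot:", url)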

2)        Integrity:

  • Broken links: Once you have all the URLs from the crawl, check for any errors and present these back to the web development team then and there. Rankings can often improve simply by resolving a myriad of 404s and incorrectly used 301s. Using the advanced export in Screaming Frog can help you pinpoint these, and a simple status-code sweep is sketched after this list.
  • Canonicalisation: Duplicate content issues can arise from inconsistent trailing slashes in folder URLs, as well as www and non-www versions, all rendering and presenting multiple pages with the same content to Google. If you have content that is nearly duplicated, build out the content on those pages so they are unique; otherwise, 301 redirect them to a relevant page higher up in the site architecture.
  • Site speed: As we heard from Bastian Grimm, poor site speed is a negative ranking factor, and in Pete’s experience real site-speed problems will inhibit organic rank. Removing any junk code will help with this.
  • Outbound links: Following the advent of Penguin, it’s necessary to review all outbound links – you can receive manual penalties for outbound links too! Make sure you trust every source you link out to.
  • Errors: Check all errors in Google Webmaster Tools and across devices.
  • Tracked pages: If you add the Google Analytics code to your crawler’s custom filter (Screaming Frog supports this), you can see which pages are actually being tracked.
  • Reduce code: Removing junk source code is still best practice, as junk code slows page load and increases file size.
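
The status-code sweep mentioned above can be as simple as this sketch, assuming your crawled URLs are exported to a file called urls.txt (a placeholder name), one per line. Note that urlopen follows redirects, so this surfaces final errors; auditing 301 chains needs a non-following opener:

    import urllib.error
    import urllib.request

    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req) as resp:
                status = resp.status
        except urllib.error.HTTPError as e:
            status = e.code
        if status >= 400:
            print(status, url)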

3) De-duping:

  • Scraped content: Tools like Copyscape, as well as exact-match searches, can reveal duplicate domains or scraped content.
  • Duplicate domains: Reverse IP lookups can help you check whether the client owns any other sites on the same server. Do you know who else your client shares a server with? Definitely worth checking.
  • Titles, descriptions, H1s: Simple but effective – replacing duplicated tags with unique ones can give rankings a boost (see the sketch after this list for a quick way to find the duplicates).
  • Pagination issues: Use rel=”prev” and rel=”next” to clarify the relationship between paginated pages, or canonicalise to a “view all” page.
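
Finding duplicated titles is an easy win to automate. A minimal sketch, assuming a crawl export called crawl.csv with “Address” and “Title 1” columns (Screaming Frog’s naming; both names are assumptions to swap for your own export):

    import csv
    from collections import defaultdict

    pages_by_title = defaultdict(list)
    with open("crawl.csv", newline="") as f:
        for row in csv.DictReader(f):
            pages_by_title[row["Title 1"].strip()].append(row["Address"])

    for title, urls in pages_by_title.items():
        if len(urls) > 1:
            print(f"{len(urls)} pages share the title {title!r}:")
            for url in urls:
                print("   ", url)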

4) Optimisation:

  • Mapping keywords to pages: If you can’t clearly identify a single keyword associated with a page then you may want to reconsider the page’s content or purpose. A keyword to page map is also a good record to fall back on.
  • Review internal linking and information architecture: Involving SEO teams early in the web development process is far more effective than trying to fix things later down the line.
  • URL structure: Clean, user-friendly URLs are ideal, but also consider the duplicate content issues that can arise from filtering systems and the like.
  • Titles, descriptions and headings: Instead of focusing on keywords, try to write a sentence that appeals to users, as this can improve click-through rates.
  • H1: Include the keyword and communicate clearly to visitors what the page’s content is about.
  • Alt text for images: Still important to mark these up, particularly for image search.

5) Refine:

  • Rich snippets: Consider the benefits of schema.org markup.
  • Shareability: Pick the right sharing options for your audience.
  • hreflang: Implement hreflang tags and set up country-specific geo-targeting in GWT; a minimal hreflang sketch follows this list.
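
hreflang is fiddly mostly because every language version must reference every other version (and itself). A minimal sketch that prints the reciprocal tags for one page, assuming a simple subdirectory structure on the placeholder domain example.com:

    versions = {
        "en-gb": "https://www.example.com/uk/shoes/",
        "en-us": "https://www.example.com/us/shoes/",
        "de-de": "https://www.example.com/de/schuhe/",
    }

    # Each version's <head> carries the full set, including itself.
    for lang, url in versions.items():
        print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')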

6) Repeat:

Go back and repeat the whole process, again and again – an audit is never really finished.

Lesson 4: Facing the challenges of global site migration

Kate Dreyer tackled the subject of best practice for global site migration, giving a broad overview of the process. One of the first steps is domain strategy, and Kate stressed that once you decide on a format you should stick with it. Whatever the reason for migrating, whether a redesign or a CMS change, domain strategy should underpin everything. Consider what works best for the target market, whether that be subdirectories, ccTLDs or subdomains.

The next step in the planning phase is to consider how to manage the whole process. Kate recommended two tools to help manage site migrations, Jira and Confluence, as these can ticket tasks, document progress and assign deadlines. If neither is available to you, Basecamp can be a suitable alternative.

Following this, think about URL structure and keywords. If you are translating keywords, never use a direct translation, as words can carry different meanings across languages and cultures; it is always safer to use a native speaker for more relevant and appropriate language. Additionally, search volumes for a keyword may be lower in other countries, so check that you are targeting the best term for each market.

Well-constructed XML sitemaps and the correct use of hreflang tags are also an important step. But before you do anything with a site migration, make sure you block the new site in its robots.txt file. Once you are ready to go live, check everything is in working order before you unblock the new site and, finally, block the old one. Simple but sage advice.
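
That unblock/block switch is easy to get wrong under launch pressure, so it is worth scripting the check. A minimal sketch, with the two hostnames below as placeholders for your own staging and live domains:

    from urllib.robotparser import RobotFileParser

    def blocked_for_googlebot(host):
        rp = RobotFileParser(f"https://{host}/robots.txt")
        rp.read()
        return not rp.can_fetch("Googlebot", f"https://{host}/")

    # Before launch the new (staging) site should be blocked;
    # at go-live the new site must be open and the old one blocked.
    print("new site blocked:", blocked_for_googlebot("new.example.com"))
    print("old site blocked:", blocked_for_googlebot("www.example.com"))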

Post launch, keep a keen eye on Google Webmaster Tools for errors, as well as Google Analytics. One of the most time-consuming tasks, however, can be going back through the site’s link profile and getting links to point at the new domain. Although 301 redirects will pass value, the most value comes from links pointing directly at the new site, so it’s best to contact webmasters and ask for the links to be updated.
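
While you chase those link updates, the 301s themselves should be verified. A minimal sketch, assuming a mapping file old_to_new.csv of “old URL, new URL” pairs (a placeholder name); it checks that each old URL answers with a single 301 pointing at the expected destination:

    import csv
    import urllib.error
    import urllib.request

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, *args, **kwargs):
            return None   # stop urllib from following the redirect

    opener = urllib.request.build_opener(NoRedirect)

    with open("old_to_new.csv", newline="") as f:
        for old, new in csv.reader(f):
            try:
                resp = opener.open(old)
                print("NO REDIRECT (status", resp.status, "):", old)
            except urllib.error.HTTPError as e:
                location = e.headers.get("Location")
                if e.code != 301 or location != new:
                    print("WRONG:", old, "->", e.code, location)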

Speaker bios:

Dawn Anderson @dawnieando

The Director of Move It Marketing, Dawn entered the world of SEO via a less-trodden path: she retrained as an SEO web developer in 2008 after working as a director in the service industry for 15 years. Her passion lies in technical SEO and, following a successful stint at established media and digital marketing agencies, she is now devoted to her own projects while continuing to offer freelance SEO consultancy.

Bastian Grimm @basgr

Managing Partner and founder of Grimm Digital, Bastian Grimm has experience as a software developer and is now involved in international SEO as a digital marketing consultant.

Presentation: http://www.slideshare.net/bastiangrimm/the-need-for-speed-brightonseo-2014

Pete Handley @ismepete

As Technical Director at the digital marketing agency @TheMediaFlow, Pete Handley creates and delivers technical SEO strategies whilst supporting the agency’s growth and development. As passionate about karaoke as he is about digital marketing, you may well discover a YouTube video of his singing skills ahead of his SEO audit checklist.

Presentation: http://www.themediaflow.com/2014/04/technical-checklist/

Kate Dreyer

Global SEO Manager for Education First, Kate Dreyer has over four years’ experience in SEO, content marketing and social media. You can read more about her digital marketing escapades at http://www.katedreyer.co.uk.

Presentation: http://www.katedreyer.co.uk/2014/04/25/brightonseo-april-2014-global-site-migration-monster/ 

About the author

Briony Gunson is an SEO Client Manager at Resolution Media, part of Manning Gottlieb OMD, a London-based media agency.

Having previously worked across PPC, SEO and Social, Briony is passionate about integrated SEM strategies and handles accounts for a wide range of UK clients. She enjoys the challenge of working with multiple teams, agencies and stakeholders to develop holistic digital strategies and is always looking for ways to improve and tailor processes, relationships and practices.

When not burning the midnight oil at work, she’s tearing about on a netball court, cycling to dance classes or bopping about at a gig. You can come say hello @BrionyGunson
