SES London Day 2 – SEM Power Tools Set: SEO & PPC #SESLon

13th February 2014

With all the changes over the last 12 months, from the Panda to the Hummingbird updates, there is a need to refocus both your PPC and SEO strategies to succeed with your marketing efforts. Jasmin Ahrens (Director – International Digital Acquisition, American Express) and Richard Baxter (@richardbaxter) take us through some of the best practical tools you will need to drill deeper than just keywords and to manage and succeed with your campaigns.

Jasmin opens the session by looking at PPC and SEO and how to approach selecting the right tools for your company.

Strategies for SEO have changed: the focus has moved away from ranking on individual keywords and towards the much wider long-tail search and the user. Companies need to think more about the audience in both SEO and PPC, rather than about ranking number one for keywords. Jasmin suggests taking a more human approach to search, keeping in mind that customers now move across channels, from SEO to PPC, to reach your site. Think about:

  • Demographics data – genders
  • Secondary signals – location of visitor
  • Onsite behaviour – pageviews

Use this data to inform and shape your strategy so you have a much better understanding of the kind of customer you are focused on reaching. One of the most important factors is to choose a goal and make sure that your strategy is manageable.

Building your strategy

Any search tool that you use should deliver a baseline to enable you to monitor and measure, but at the same time consider:

  • Which tool is best for your industry, as certain tools are better suited to different industries
  • What funds and budget you have – how complicated would your business case be?
  • Which channel is more important to you and needs the most attention
  • Set guidelines and objectives for the space you want to own – PPC or SEO
  • Whether you can use a single off-the-shelf master tool
  • Reporting should be easy, quick and efficient
  • Make sure you can gain actionable data for campaigns
  • Have the right resource and “in-house” experience to use the tool
  • Make sure everything is tracked

Jasmin suggests some of the following tools for connecting the dots; she uses some of these herself, and the list is not exhaustive:

  • Analytics – Omniture, Webtrends, Google Analytics
  • Campaign data – Marin, IgnitionOne, DoubleClick
  • Attribution – Marin, IgnitionOne, Google Analytics, DoubleClick
  • Advanced channel data – Adthena, BrightEdge, SEOmoz
  • Audience – BlueKai, Tealeaf

When creating your tool-set, it’s a good idea to phase the tools in. That gives you the focus to shape what you want from each one, see how it works and decide whether it’s right for your company before moving on to further tools.

The tools are still nothing without people. Having the right people employed – channel experts, analysts, strategists and innovators – will ultimately let your company use the data in the right way, because without the right people all you are doing is collecting data.

So, in all: understand your SEM audience, set your goals and define your strategy, choose your SEM tool-set carefully, track everything and leverage the data. And lastly, don’t stop – make all your decisions based on data and be prepared for tomorrow, as the face of search is changing constantly.

Richard Baxter is up next and uses his own SEOGadget as a case study in doing a link audit. SEOGadget recently received an unnatural link warning in Google Webmaster Tools, so they had to run a full link audit to find which backlinks had triggered the penalty, and then submit a “disavow” file through GWT to have the penalty removed.

Tools to use for an audit

  • MajesticSEO – it does a fresh crawl of its database every 90 days, and using its Trust Flow and Citation Flow metrics you can easily see if there are any dubious links to your site
  • Ahrefs – really good at freshness, discovering new links fast. It is a great tool for assessing any new links to your site and domain, letting you quickly spot bad links coming in
  • Google Webmaster Tools – great for providing backlink data. Richard takes the backlink data, exports it and uses it across the other tools to verify any bad links and cross-reference those that link to the site
  • SEO Profile – you can search by link context, URLs, page title and source. The only drawback at the moment is that the tool is fairly new and its depth of crawl is a little limited

Data consolidation

  • LinkRisk is a great tool for an agency or SEM team working across lots of different domains. Within LinkRisk you can connect GWT, MajesticSEO and Ahrefs so all the data is in one place, consolidating which links across all the tools are deemed unnatural or come from a “risky” domain. You can then highlight a link quickly, add it to your disavow file and submit that to Google.
  • SEOGadget for Excel – one of Richard’s own, good for getting at lots of link data quickly. Via some nifty “calls”, you can build a comprehensive data set and understand your linking profile fast.

When using some of the tools that Richard suggests, you may find domains with a high PageRank linking to your site. Check them out: on some occasions he has found that the PageRank being reported is not quite right and can come in higher than it really is.

What to collect and what to look for in the data

The data and information you need to check in order to understand your linking profile include:

  • Count of domains with site-wide links
  • Links from penalised domains
  • An odd PageRank distribution curve
  • Links with exact-match keyword anchors
  • Links from pages with malicious/explicit content and directories
  • Lots of domains on the same C-block IP
  • Links pointing to a 404 destination URL
  • Hidden dofollow comment spam – e.g. under a Disqus plugin
  • Domains that are not in Google’s index
  • The number of links back to your site with zero PageRank – manually check and add to the disavow file
  • Add all directories into the disavow file – you won’t be penalised for disavowing a domain that may already have been penalised
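Several of the checks above can be automated before the manual review. The sketch below is a minimal, hypothetical example: the column names, domains and thresholds are made up for illustration and do not reflect any particular tool's export format.

```python
import csv
import io
from collections import Counter

# Hypothetical backlink export; the columns and data are assumptions.
SAMPLE = """source_url,domain,ip,anchor,page_rank
http://blog-a.example/post,blog-a.example,93.184.216.1,cheap widgets,0
http://news-b.example/story,news-b.example,93.184.216.2,SEOGadget,4
http://dir-c.example/listing,dir-c.example,93.184.216.3,cheap widgets,0
http://site-d.example/page,site-d.example,10.0.0.5,useful resource,3
"""

# Exact-match money keywords to flag (illustrative).
TARGET_ANCHORS = {"cheap widgets"}

def c_block(ip):
    """First three octets of an IPv4 address, e.g. '93.184.216'."""
    return ".".join(ip.split(".")[:3])

def flag_risky(rows):
    """Apply a few of the checklist heuristics to each backlink row."""
    blocks = Counter(c_block(r["ip"]) for r in rows)
    risky = []
    for r in rows:
        reasons = []
        if r["anchor"].lower() in TARGET_ANCHORS:
            reasons.append("exact-match anchor")
        if int(r["page_rank"]) == 0:
            reasons.append("zero PageRank")
        if blocks[c_block(r["ip"])] > 2:
            reasons.append("many domains on one C-block")
        if reasons:
            risky.append((r["domain"], reasons))
    return risky

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
for domain, reasons in flag_risky(rows):
    print(domain, "->", ", ".join(reasons))
```

A script like this only shortlists candidates; as the session stresses, every flagged domain still needs a manual look before it goes anywhere near a disavow file.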

The de-dupe test

Download all of the link data from GWT and then extract the domains. (Across the tools mentioned here, the number of unique domains with links back to your site can vary.) Put together all of the unique domains you have de-duped from all of the tools and you will end up with a final de-duped list – from here, manually review the links in the data set. Richard says GWT is probably the best overall tool for extracting the unique URLs for your site links.
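The de-dupe step above can be sketched in a few lines of Python. The tool exports and URLs here are made up for illustration; in practice each list would come from the CSV export of the respective tool.

```python
from urllib.parse import urlparse

# Hypothetical per-tool exports: lists of linking URLs (made-up data).
gwt = ["http://blog-a.example/post-1", "http://dir-c.example/listing"]
majestic = ["http://blog-a.example/post-2", "http://news-b.example/story"]
ahrefs = ["http://news-b.example/story", "http://site-d.example/page"]

def unique_domains(*exports):
    """Merge URL lists from every tool and de-dupe down to linking domains."""
    domains = set()
    for export in exports:
        for url in export:
            domains.add(urlparse(url).netloc.lower())
    return sorted(domains)

final_list = unique_domains(gwt, majestic, ahrefs)
print(final_list)
# → ['blog-a.example', 'dir-c.example', 'news-b.example', 'site-d.example']
```

The sorted, de-duped output is the list you would then walk through by hand.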

Manually reviewing the links in your profile allows you to understand what links back to your site, what those links are worth and, of course, which links are good. If you just add them all into the disavow file, you run the risk of submitting a good backlink, which you will never get back, and losing all the PageRank that good link passed to your site.
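For reference, the disavow file itself is a plain UTF-8 text file: lines starting with `#` are comments, `domain:` entries disavow a whole domain, and bare URLs disavow single pages. A minimal sketch for generating one from a reviewed list (the domains and URLs below are made up):

```python
# Hypothetical, manually reviewed lists of bad linking domains/pages.
bad_domains = ["spam-directory.example", "link-farm.example"]
bad_urls = ["http://blog-a.example/paid-post"]

lines = ["# Disavow file – reviewed after manual link audit"]
lines += ["domain:" + d for d in bad_domains]  # disavow whole domains
lines += bad_urls                              # disavow individual pages

disavow_txt = "\n".join(lines) + "\n"
print(disavow_txt)
```

The resulting file is what gets uploaded through Google Webmaster Tools' disavow links page.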

In conclusion, make sure you choose SEM tools with your budget, resources, industry, the people who have to use them, the reporting and your goals in mind. Try phasing the tools in, so you can move from plain analytics all the way through to attribution-model tool-sets.

When it comes to a link audit, use a cross-section of tools. Richard offers that GWT is one of the best for reporting the link profile, and once you are happy with the extraction of the URLs and domains linking back in, it’s always good to manually dive in and dig deep into each domain and link – this will ultimately help you understand your own domain’s backlink profile.


Written By
Russell O’Sullivan is an all-round Senior Digital Marketing Manager. With over 15 years of experience in the digital environment, he has worked across varied disciplines such as Content Strategy/Marketing, PPC, SEO, Ecommerce, Social Media/Marketing, Web Design and UX.