SEO is what it is because of the great tools available, giving us deeper insights, fresh ideas and the ability to back up everything we say with hard evidence. Here, I’ve listed a few of my very favourite tools, as well as the unique benefits that each brings to the table.
If you work in SEO, I’m sure you’ll have come across a few of these tools before. If this post inspires you to add another one or two to your repertoire, though, I’ll consider it a success!
Searchmetrics is one of the most popular SEO analysis platforms for a reason. With the largest dataset available from any SEO tool, it’s the perfect place to find a wealth of information about how a site’s performing in search.
One of Searchmetrics’ handiest features is the “SEO Visibility” score, a nominal, proprietary metric which gives an overview of a domain’s overall performance, on both mobile and desktop. It is derived from the search value of your existing rankings, and is particularly useful when assessing a site’s performance against a competitor’s.
The “Traffic Value” score is also a useful one, estimating what your organic keywords would cost if you paid for them through AdWords, as is the “Position Spread” graph within the “Rankings” tab, which shows where the bulk of a site’s most valuable rankings sit (in terms of position in the search results). By exporting the “long tail” keywords from this view, you can also find big potential wins: “nearly there” rankings which could generate big rises in traffic if successfully bumped up the results.
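As an illustration of that “nearly there” filtering, here is a minimal sketch in Python. It assumes a hypothetical CSV export with `keyword`, `position` and `search_volume` columns (real Searchmetrics exports may name columns differently), and pulls out terms sitting just off page one, highest volume first.

```python
import csv

def find_quick_wins(path, min_pos=11, max_pos=20):
    """Filter an exported rankings CSV (assumed columns: keyword,
    position, search_volume) down to 'nearly there' terms sitting
    just off page one, sorted by search volume, highest first."""
    wins = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            pos = int(row["position"])
            if min_pos <= pos <= max_pos:
                wins.append((row["keyword"], pos, int(row["search_volume"])))
    # Biggest-volume opportunities first
    return sorted(wins, key=lambda r: -r[2])
```

Positions 11–20 (page two of Google) are used as the default window here, but the thresholds are just parameters you can tune to your own definition of a quick win.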
When it comes to content ideation and competitor analysis, there are few better and easier to use tools than BuzzSumo. The platform identifies the top performing articles in a particular area, or for a specific site or topic in terms of social shares.
What makes the tool so great is the simple view it provides of social performance online, allowing you both to find clues as to what people find interesting and to see what popular posts have in common. It’s also useful for looking introspectively at your own content, letting you quickly assess what’s performing well and what may benefit from a re-think.
The Screaming Frog SEO Spider allows you to scrape an entire website, a list of URLs, or a search engine results page, and is invaluable for too many reasons to list here. One crucial use, though, is checking URL response codes in bulk. By first crawling a site, then exporting the results to Excel, you can simply and easily note all pages returning 301 or 302 redirects, as well as those returning 404 errors.
Looking at the volume of links with these response codes and altering them accordingly is an essential way to keep a site ticking over, and benefits both users and search engine performance by creating a quicker, easier-to-navigate and less frustrating site. You can also crawl an XML sitemap in Screaming Frog, ensuring that it isn’t directing search engines to any unnecessary redirects or 404s either.
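The bulk response-code check above can be sketched in a few lines of Python rather than Excel formulas. This assumes a hypothetical crawl export with `Address` and `Status Code` columns (close to Screaming Frog’s defaults, but verify against your own export before relying on it):

```python
import csv
from collections import Counter

def summarise_status_codes(path):
    """Tally response codes from a crawl export (assumed columns:
    Address, Status Code), so redirects and 404s stand out."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["Status Code"]] += 1
    return counts

def urls_with_status(path, codes=("301", "302", "404")):
    """List the URLs returning any of the given status codes."""
    with open(path, newline="") as f:
        return [(row["Address"], row["Status Code"])
                for row in csv.DictReader(f)
                if row["Status Code"] in codes]
```

The summary gives you the headline figures for reporting; the URL list is what you’d actually work through when fixing internal links.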
In order to check the quality and quantity of links to a site, Majestic is the ideal tool. Giving a time-specific view of incoming links, it’s the perfect way to spot a dodgy backlink profile, as well as to see where the most valuable links are coming from.
Majestic lets you review the number of overall links to a site, compared to the number of domains they’ve come from. What’s more, you can see the link anchor text and the backlink breakdown. This can give invaluable clues as to whether links may have been bought or manipulated in the past, and whether these are benefiting, or indeed harming, a site.
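The two signals described above — links per referring domain, and how heavily the anchor text concentrates on one phrase — can be reduced to a couple of rough numbers. This is a minimal sketch, assuming you have exported the backlinks as (source domain, anchor text) pairs; the thresholds at which these figures become suspicious are a judgment call, not shown here:

```python
from collections import Counter

def link_profile_signals(backlinks):
    """Given (source_domain, anchor_text) pairs, return two rough
    manipulation signals: the links-per-referring-domain ratio,
    and the share of the single most common anchor text."""
    domains = {domain for domain, _ in backlinks}
    anchors = Counter(anchor for _, anchor in backlinks)
    top_anchor_share = anchors.most_common(1)[0][1] / len(backlinks)
    return len(backlinks) / len(domains), top_anchor_share
```

A very high links-to-domains ratio (thousands of links from a handful of sites) or one exact-match anchor dominating the profile are the classic patterns worth a closer manual look.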
Majestic also gives a handy snapshot of a site’s link performance with its Citation Flow/Trust Flow graph. This helps you to visualise the likely quality of links to a site, both in terms of link equity (essentially power) and likely trustworthiness (by virtue of their closeness to known, trusted pages).
Google Site: Search and Data Miner Chrome Extension
Firstly, by performing a simple site: search on Google, you can quickly assess how many pages on a site have been indexed. Comparing this to the number of pages listed in a sitemap, or found through a Screaming Frog crawl, can give a number of clues as to a site’s performance, or highlight likely issues such as index wastage.
However, the real magic happens when the Data Miner Chrome extension comes in. The tool can scrape the contents of search engine results pages, meaning that (whilst a site: search only gives you a number of pages) a clever use of Data Miner can see you export all of the URLs indexed on Google. Whilst the search engine will no longer let you scrape all of these in one go, clever searching (for example, one category at a time) can get you virtually a full list of indexed URLs.
This helps you to spot site areas which are failing to be indexed, as well as those which have unintentionally been added to the index (such as private site areas, and certain user generated content or parameters).
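Once you have the two lists — URLs scraped from the site: results and URLs from the sitemap or crawl — the comparison itself is simple set arithmetic. A minimal sketch (the function name and return shape are my own, not from any tool):

```python
def index_gap_report(indexed_urls, sitemap_urls):
    """Compare URLs found in site: results against the sitemap.
    Returns (missing_from_index, unexpectedly_indexed):
    - pages you expected Google to index but didn't find, and
    - pages in the index that aren't in your sitemap at all."""
    indexed, expected = set(indexed_urls), set(sitemap_urls)
    return sorted(expected - indexed), sorted(indexed - expected)
```

The first list points at indexation problems to fix; the second flags candidates for noindexing or canonicalisation, such as the private areas and parameter URLs mentioned above.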
If you’re looking for useful competitor keyword data, SEMRush is a great place to start. Whilst the tool’s dataset is slightly smaller than that of Searchmetrics (around 200 million keywords as opposed to 600 million), SEMRush takes a snapshot of the top 20 pages ranking for each term, and, seeing as so few users go beyond this point in their Google searches, gives an accurate picture of keyword rankings.
For a fresh insight into keyword rankings, and those shared between a site and its key competitors, within the main dashboard select “Domain vs. Domain” under “Domain Analytics”. Once you’ve input the sites you want to compare, click “enable graphs” for a handy Venn diagram of keyword overlap, and you can also easily export results to review both sites’ performance over valuable shared keywords.
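The logic behind that Venn diagram is, again, just set operations, which you can reproduce yourself on two exported keyword lists. A minimal sketch (the function name is my own; the exported columns will vary by tool):

```python
def keyword_overlap(our_keywords, competitor_keywords):
    """Split two keyword lists into the three Venn regions:
    terms only we rank for, terms only the competitor ranks for,
    and the shared battleground in the middle."""
    ours, theirs = set(our_keywords), set(competitor_keywords)
    return ours - theirs, theirs - ours, ours & theirs
```

The shared set is usually where the head-to-head opportunity lies; the competitor-only set is effectively a ready-made content gap list.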
I hope this post has given you a few new ideas for tools to use in your SEO strategy. If you have any questions about the tools covered here, or would like more information on any of the processes, I’d be glad to help, so just drop a comment below.