11 Technical SEO Tools You Should Be Using


As an SEO, I use a wide range of tools to help me with my job.  There are loads of tools out there and, to be honest, I often forget about some of the ones available to me.  Today I wanted to show you the contents of my bookmarks when it comes to technical SEO.  I’ll talk about each tool and highlight some SEO tasks that it can help with.

I will be talking about XML Sitemap Validator, Bulk HTTP header response checker, W3C Internationalization Checker, Web Page Speed Test, SEO Toolkit for IIS, Built With, Schema Creator, Reverse IP lookup, Spy on Web, Check Websites on same IP C Class and Screaming Frog.

1. XML Sitemap Validator


What it does:  Checks your XML sitemaps for broken URLs.

Why this matters:  Having broken links in your XML sitemap can cause the search engines to lose trust.  Bing confirmed this in an interview over on Stone Temple last year.  The key thing to remember is that all URLs in your sitemap should return a status 200 header response.  You do not want to see any redirects, 404s or 500s.
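If you want to script the same kind of check yourself, here is a minimal Python sketch of the idea (this is not how Map Broker works internally, just an illustration): pull the `<loc>` entries out of a sitemap and request each URL.

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Extract the <loc> URLs from an XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def check_url(url):
    """Return the HTTP status code for one URL (HEAD request)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def audit_sitemap(xml_text):
    """Map each sitemap URL to its status; anything not 200 needs a look."""
    return {url: check_url(url) for url in parse_sitemap(xml_text)}
```

Note that `urlopen` follows redirects by default, so a 301 would surface as the final destination’s status; for a strict audit you would want to disable redirect handling as well.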

Where: http://ipullrank.com/tools/map-broker/

2. Bulk HTTP header response checker


What it does:  Allows you to check the HTTP header codes of a bunch of URLs at the same time.  You can also compare the responses for different user agents such as Googlebot.

Why this matters:  Checking header response codes is important because you need to keep an eye out for any responses that are not 200.  These are commonly things like 404, 500, 301 and 302.  Sometimes there are good reasons for 301s and 302s, but you’ll want to keep an eye on them and fix them if appropriate anyway.  Check out this guide to learn more about the different types of HTTP responses.
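If you only need raw status codes, this kind of bulk check is easy to script.  A rough Python sketch (an illustration of the idea, not the tool’s implementation) that reports the status of each URL for a chosen user agent, without following redirects so 301s and 302s show up as themselves:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects, so 301/302 surface as the raw status."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def build_request(url, user_agent="Mozilla/5.0"):
    """A HEAD request carrying a custom User-Agent (e.g. 'Googlebot')."""
    return urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": user_agent}
    )

def bulk_check(urls, user_agent="Mozilla/5.0"):
    """Map each URL to its HTTP status code for the given user agent."""
    opener = urllib.request.build_opener(NoRedirect)
    results = {}
    for url in urls:
        try:
            with opener.open(build_request(url, user_agent)) as resp:
                results[url] = resp.status
        except urllib.error.HTTPError as e:
            results[url] = e.code  # 301, 302, 404, 500, ...
    return results
```

Running `bulk_check` once with the default user agent and once with `"Googlebot"` and diffing the results is a quick way to spot pages that treat search engine crawlers differently.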

Where:  http://www.tomanthony.co.uk/tools/bulk-http-header-compare/

3. W3C Internationalization Checker


What it does:  You can enter your URL and the tool will look for elements of the page that indicate whether it is targeted to a specific language or country.

Why this matters:  If you are targeting a specific country or language, you need to make sure you are sending all the right signals from a markup perspective.  There are some subtle markup elements that help tell the search engines which country you are targeting.  The tool can also highlight where you may be accidentally targeting the wrong variation of a language, such as US English instead of GB English.
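A couple of these signals are easy to sniff yourself.  The sketch below (a tiny fraction of what the W3C checker inspects) pulls out the `<html lang>` attribute and any hreflang alternate links from a page’s HTML:

```python
from html.parser import HTMLParser

class I18nSignals(HTMLParser):
    """Collect the <html lang> attribute and hreflang alternate links."""
    def __init__(self):
        super().__init__()
        self.lang = None
        self.hreflangs = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "html" and "lang" in attrs:
            self.lang = attrs["lang"]
        elif (tag == "link" and attrs.get("rel") == "alternate"
              and "hreflang" in attrs):
            self.hreflangs[attrs["hreflang"]] = attrs.get("href")

def i18n_signals(html_text):
    """Return (page language, {hreflang: alternate URL})."""
    parser = I18nSignals()
    parser.feed(html_text)
    return parser.lang, parser.hreflangs
```

If the declared `lang` is `en-US` on a site aimed at the UK, or the hreflang alternates are missing or point at the wrong regional variants, that is exactly the kind of accidental mistargeting mentioned above.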

Where: http://validator.w3.org/i18n-checker/

4. Web Page Speed Test


What it does:  Lets you compare the speed of your website against your competitors.  The output is a video which shows each site loading side by side.  You can also change the location of the test server to take account of where servers are located.

Why it matters:  Whilst site speed matters a little from an SEO point of view, to me it is even more important for users, particularly if you are an eCommerce website, so you should make sure that your website loads as quickly as possible.  Using a tool like this one can be quite powerful as it gives a great visual indicator of how fast your site really is.  Showing this to your bosses or to developers can be a great way of getting buy-in for site speed improvements.
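WebPagetest measures far more than raw fetch time (rendering, assets, repeat views), but for a quick sanity check you can time a download yourself.  A crude sketch that only measures how long one HTML document takes to fetch:

```python
import time
import urllib.request

def time_call(fn):
    """Time a callable, returning (result, elapsed seconds)."""
    start = time.perf_counter()
    result = fn()
    return result, time.perf_counter() - start

def fetch_time(url):
    """Rough fetch time for a single URL: (bytes downloaded, seconds).

    Ignores rendering, images, CSS/JS and caching, so treat it only as
    a baseline, not a substitute for a full waterfall test.
    """
    body, seconds = time_call(lambda: urllib.request.urlopen(url).read())
    return len(body), seconds
```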

Where:  http://www.webpagetest.org/

5. SEO Toolkit for IIS


What it does:  A more appropriate question in the case of the SEO toolkit is what doesn’t it do!  There are loads of things it can do, but the main thing that I tend to use it for is crawling large websites and gathering data about each URL.  The crawl can be incredibly powerful, especially if you crawl from the cloud.

Why it matters:  When doing a technical SEO audit of any site, you should start with a deep crawl of the site to gather as much data as you can.  For example, you’ll want to gather header response codes, META data and URL structures.  This gives you a great starting point and can often highlight a bunch of technical SEO issues very quickly.
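To give a feel for what a crawler does under the hood, here is a sketch of its first step: pulling the same-host links out of a fetched page so they can be queued for crawling.  (The SEO Toolkit does far more than this – META data, violation reports and so on – this is just the core idea.)

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def extract_links(base_url, html_text):
    """Return absolute, same-host links found in a page, deduplicated."""
    parser = LinkExtractor()
    parser.feed(html_text)
    host = urlparse(base_url).netloc
    urls = {urljoin(base_url, href) for href in parser.links}
    return sorted(u for u in urls if urlparse(u).netloc == host)
```

A full crawler would loop: fetch a URL, record its status code and META data, extract links like this, and add any unseen same-host URLs to the queue.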

Where:  http://www.iis.net/download/seotoolkit

6. Built With


What it does:  Tells you lots of information about what is sitting behind a website.  For example, what framework it is built on, what CMS it uses, what analytics package it uses etc.

Why it matters:  It can be useful to know this kind of information because there can sometimes be technical SEO problems caused by the software / platforms used.  For example, there are known issues with using the .NET framework, as well as with WordPress.  So being aware of these can help you spot problems pretty quickly.

Where:  http://builtwith.com/

Extra tip:  Check out the Built With trends page for some interesting stats on the usage of various web technologies.

7. Schema Creator


What it does: A really easy interface to generate the correct Schema.org markup for your content.  You can input your content such as a person or organisation, then copy and paste the output onto your website.

Why it matters:  Use of Schema.org vocabulary in search results is growing and you need to take advantage of it to give the search engines more information about your content.  This tool makes this process painless and super easy.
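Schema Creator gives you markup to paste straight into your HTML.  If you generate pages programmatically you can build the same information yourself – the sketch below builds a Schema.org Person as JSON-LD, an alternative serialisation of the same vocabulary (the property names here come from Schema.org; the function itself is just an illustration):

```python
import json

def person_jsonld(name, url=None, job_title=None):
    """Build a minimal Schema.org Person object serialised as JSON-LD."""
    data = {
        "@context": "https://schema.org",
        "@type": "Person",
        "name": name,
    }
    if url:
        data["url"] = url
    if job_title:
        data["jobTitle"] = job_title
    return json.dumps(data, indent=2)
```

The resulting JSON would be embedded in the page inside a `<script type="application/ld+json">` element.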

Where:  http://schema-creator.org/

8. Reverse IP lookup


What it does:  Allows you to input a domain and returns a list of other domains that are hosted on the same server.  You can also enter an IP address.

Why it matters:  There are a few uses for this one.  One, from a technical perspective, is seeing what other domains are on the same server as your own site – if any of these sites are spammy or have very bad link profiles, it is possible for Google to associate your website with them.

Something to note here – many shared hosting packages could have hundreds of sites on the same server which are all controlled by different owners.  So be careful here and don’t automatically assume that sites on the same server are controlled by the same people.

Where:  http://www.yougetsignal.com/tools/web-sites-on-web-server/

9. Spy on Web


What it does:  Spyonweb also does a reverse IP lookup as outlined above.  But additionally, it allows you to check for websites that are using the same Google Analytics, Google Webmaster Tools and Google Adsense accounts.

Why it matters:  There are occasions where someone else may be using the same Google Analytics code as you.  This happens more often than you’d imagine.  This tool allows you to find websites that are using your Google Analytics code quickly.  It also helps when trying to identify link networks that leave footprints such as the same AdSense code.

Where:  http://spyonweb.com/

10. Check Websites on same IP C Class


What it does:  You can enter a list of domains and see very easily which ones are hosted on the same C Class IP.

Why it matters:  This is a tool more for technical link analysis than on-page SEO, but it is really useful so I wanted to include it.  If you are doing link profile analysis, you can upload a list of domains and easily find footprints of sites that are hosted on the same C Class IP.
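The grouping itself is simple to script.  This sketch resolves each domain with a standard DNS lookup and buckets domains by the first three octets of the IPv4 address (a crude stand-in for the tool – it only sees one A record per domain):

```python
import socket
from collections import defaultdict

def c_class(ip):
    """Return the 'C Class' (/24) prefix of a dotted-quad IPv4 address."""
    return ".".join(ip.split(".")[:3])

def group_by_c_class(domains, resolve=socket.gethostbyname):
    """Bucket domains by shared C Class; only buckets with 2+ domains.

    DNS resolution needs network access; lookup failures are skipped.
    """
    groups = defaultdict(list)
    for domain in domains:
        try:
            ip = resolve(domain)
        except OSError:
            continue
        if not ip:
            continue
        groups[c_class(ip)].append(domain)
    return {k: v for k, v in groups.items() if len(v) > 1}
```

Any bucket with several domains in it is a potential footprint worth a closer look – bearing in mind the shared-hosting caveat from the previous section.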

Where:  http://www.ip-report.com/ 

11. Screaming Frog


What it does:  A very powerful crawler which will return a bunch of info about a set of URLs such as response code, META data, rel=canonical tags and duplicate content.

Why it matters:  This should be part of any SEO’s toolkit.  Doing technical site audits with Screaming Frog is really easy.  It is similar to the SEO Toolkit I mentioned above, but I’ve included it for two main reasons.  Firstly, it is very powerful but super easy to set up – the SEO Toolkit can take a bit of work to set up.  Secondly, you can use it to crawl lists of URLs across loads of different domains, which can be useful for link profile analysis, for example.

Where:  http://www.screamingfrog.co.uk/seo-spider/

There you have it.  I’d love to hear what other tools you guys use – feel free to leave a comment and let us know, or ping me on Twitter.


(originally posted May 10 2012)

About Paddy Moogan

Paddy Moogan is co-founder of Aira, a digital marketing and web development agency based in the UK. He is also the author of “The Link Building Book”.

56 thoughts on “11 Technical SEO Tools you Should be Using”

  1. Useful post. It’s maybe worth mentioning that if you’re in iGaming – the W3C doesn’t give a feck about you [or your visitors] – in that many of their tools won’t work on live gaming sites [eg validators]. Also, I can’t help noticing the lack of mention of JS and non-JS accessibility to data – one of the biggest technical fails I see is sites which don’t work when JS is disabled [apple.com was notorious for this] .. Any cool automatic checkers for this stuff?

    1. Don’t know of any automated tools for checking JavaScript but I tend to look at the text-only cached version of a page and also disabling JavaScript in Firefox Developers tools and browsing the site that way. I think a lot of the time it’s best to look in person at these things anyway. Tools help but no tool alone can make a good audit.

      Also, great list Paddy! Some I’d not used and I do a lot of audits 🙂 Misses off perhaps my most used tool though which is HTTPFox. Bulk header checkers serve a purpose but they often miss chained redirects as well as other forms of redirect. It also shows up a hell of a lot of information that goes on in the background.

      1. Thanks Richard!

        Agree that HTTPFox is a great little tool.  I guess I tend to use that less often than the other bulk checking tools.  I mainly use it if something weird appears to be going on rather than using it on every page.

    2. Thanks for the comment.  In hindsight, yes I could have mentioned some tools for checking JS and non-JS stuff – I probably left these out as I do this via toolbars in Firefox (which I’m trying to move away from!) and manually checking as Richard points out below.

      When I do check for JS issues, I tend to open a few browser windows and compare browsing with JS turned off (using the Web Developer Toolbar) and also comparing against the text cache of the page.

      1.  Hi Paddy, you could also check out A1 Website Analyzer

        Some quick tutorial videos:

        Custom Search Entire Websites for Text and Code Using A1 Website Analyzer

        Find Duplicate Website URLs and Page Titles Using A1 Website Analyzer

        Analyze Internal Link Juice Flow with A1 Website Analyzer

        Find and Solve Broken Links and Redirects With A1 Website Analyzer

    1. Thanks for the comment Michael.

      Do you recommend any tools that are a bit more advanced that I should have included if someone isn’t brand new to the industry?  More than happy to have extra feedback.

  2. Not really sure what sitemap generators out there actually would include URLs not returning response 200 in XML sitemaps?

    At least A1 Sitemap Generator (but I assume this is common to all tools) will by default leave out duplicate URLs (“example/” and “example/index.html”, with the latter being the duplicate in most cases), URLs that point to others via canonical, HTTP redirect or meta refresh redirect, and URLs that somehow error, e.g. “404 – not found”.

    Sure, the sitemapper tools may *show* you URLs that error after a scan, but if they actually include them in generated sitemaps then… Just seems to me that it is double work to run a sitemap validator unless you are creating sitemaps manually?

    1. Thanks for the comment.

      Isn’t it possible that you generate a sitemap which contains URLs which all are 200s, then for whatever reason become a 404 after?  Some sitemap generators will update on a very regular basis, but many won’t.  Some may update weekly, for example; I’ve seen some that simply do not update automatically.

      You are right in that when a sitemap is generated automatically, it should only generate 200s – however if you have a broken page for whatever reason, surely it is a good thing to spot it?  Sure you can spot it other ways, but this is another tool in the arsenal.

      1. That is possible of course. But why not just schedule/run the sitemap generator more often then? (Then you would also be able to see where the broken URLs were linked, redirected etc. from)

        If the argument is that it saves server resources, I can half-way understand it. But only so if it restricts itself only to checking HTTP response codes (i.e. use HEAD requests.) On the other hand, if it also checks for existence of e.g. canonical information, it will need to download the pages (GET) and thus no resources will be spared.

        1. “But why not just schedule/run the sitemap generator more often then?”

          Therein lies the problem.  I know it may sound silly but this can be a hard thing to get done if we do not have direct control of the website in question.  Some massive websites will also struggle to run a sitemap generator more often – although a way around it could be your idea to just check the header.

          Btw, I agree with you overall – I just still believe this is a useful tool to use 🙂

  3. The Microsoft tool can do a complete offline copy of a website, so you can search through the source code. Very handy if you are looking for pieces of code like analytics.

    1. I’m pro IIS and SF, but if you’re only checking analytics code (at least for presence of) you should be using screaming frog. It can support custom fields (that look for ex. GA code in the source) and will pluck out pages that have it. You can cross compare with the regular crawl and away you go.

      If you’re crawling a large site with IIS, storing local copies is pretty difficult. Unless you have some super computer with terabytes of RAM at your disposal?

      1.  Have you ever tried A1 Website Analyzer? It can do many of the same things (including custom search like GA, sculpt internal link juice by keeping track of all internal links + taking relative importance into consideration, show all broken URLs including from where linked/used/redirected, validate HTML/CSS, show duplicate titles and descriptions, put all the data into CSV files etc.)

  4. Thanks for that list, some tools are new to me.
    I recently made a presentation at work about SEO tools that I use in my job.
    You can find slides here: http://goo.gl/ascDB

    If it could help some of you …

    Again thanks for sharing these tools.

  5. Especially the reverse IP lookup was a kick in the butt for me.

    10 of our strongest sites are all hosted on the same ip and it seems they are all going down a bit month after month. Mental note for next week; Switch to separate ip’s (if we still got some available) and maybe even put some names in different name servers.

  6. Awesome roundup! I was only familiar with both W3C Checker and Screaming Frog. The XML Sitemap Validator seems to be broken when I tried it, or was it just me? The Reverse IP Domain Check was an eye-opener. Thanks for bringing these tools up!

  7. Fabulous tool set, I couldn’t live without these. A post on website analysis or auditing tools would be appreciated. 

    Thanks A Million.

    1. Thanks for your comment Murat, the feedback is good to have.

      I didn’t have beginners or advanced people in mind when I wrote the post, I just wanted to share a bunch of tools which I find useful.

      Can I ask what tools you use which you feel do have value?  Would be good to get your ideas 🙂

  8. Great list! Couple of things to mention – worth noting that ScreamingFrog’s Spider can crawl sitemaps and return HTTP status; and that Bosma’s Excel SEO Tools are great for collecting data in bulk (such as HTTP status, PR checks, XPath scrapes and the list goes on!) and its free! Link:  http://nielsbosma.se/projects/seotools/functions/

    Once again – great list though! 🙂



  10. Hi Paddy Moogan,
    Thanks for sharing this nice post.
    Some of the other SEO tools are Keyword Suggestion Tool, Keyword List Generator, Keyword List Cleaner, Meta Tag Generator, Keyword Density Analyzer, Ad group generator etc.


    Thanks again

  11. This was a really helpful post, I will try to use some of these tools and will get back to you if I have any query. Thanks Paddy

  12. Thanks a lot for the great list of tools Paddy. I do think that “Chrome Sniffer”, a Chrome add-on for finding out the tools used to develop a site, plus Pingdom Tools & GTmetrix for testing page load time of a website, are also technical SEO tools that one can have in their arsenal.

  13. Thanks for the great list Paddy 🙂 I know one more tool which is great for checking position and gathering lots of information about SEO statistics – Colibritool. It has GA integration, competition monitoring, a detailed backlink checker, traffic statistics, conversion measurements and more. It works really well and we didn’t have any problems with it. Also their support responds quickly, which is a very important thing.

  14. I noticed you listed only free tools. This is good 🙂 I suggest you also add seoeta.com. There are a lot of free tools like seo analysis, seo grader. If you have access to open site explorer you can create a backlinks profile using the exported data from OSE.

  15. Nice post! I haven’t used the IIS toolkit yet but looks like I should probably download it! I also use the web dev plugin for Firefox for easily disabling javascript/CSS etc 🙂

  16. Great list. Never heard of Screaming Frog before. Also I would add the Google rich snippets testing tool to this list, which is invaluable if you are working with schema markup.

  17. Hi Paddy, this is such a nice post for me, as a few of the tools you have written about here, like the Web Page Speed Test and the SEO Toolkit for IIS, are new to me and they look interesting. There are so many other tools, but these are the best ones and most SEO professionals are using them; the one I like best is Screaming Frog.

Comments are closed.