Screaming Frog Log File Analyser – The New Must-Have SEO Tool

Screaming Frog Log File Analyser

Log file analysis has been making a comeback in technical SEO in recent years, as more and more SEOs realise the power of delving into a website’s server log files. This excellent guide to log file analysis from BuiltVisible is a great place to learn more about it.

Personally I have to admit I didn’t pay a lot of attention to log files for many years. Early in my career, log files were all we had for analysing a website’s performance, and we used tools like AWStats as rudimentary web analytics. When JavaScript-based analytics arrived on the scene, log files were quickly abandoned and we focused on the new, shiny reports that tools like Urchin and Omniture could produce. Along the way, we forgot how incredibly useful log files could be for other purposes.

This forgotten knowledge is now being rediscovered. Some old school SEOs never stopped looking at log files, and I reckon these people are more than slightly bemused by the renewed attention on log files as a source of SEO insight.

As I’ve been delving into server log files once again over the last couple of years, I’ve been relying on tools like Apache Log Viewer to extract useful insights from them. Now, however, we have a new tool at our disposal, one that is designed from the ground up specifically to help SEOs make use of log files.

This new tool is the Screaming Frog Log File Analyser – sibling to the well-known staple of any SEO’s toolbox, the Screaming Frog SEO Spider.

I’ve been fortunate enough to have tried the Log File Analyser in beta, and now that it’s publicly released I can’t wait to share with you some of the incredible uses this tool has.

Importing Log Files

First of all, you’ll need your website’s server log files. These are often stored on the web server in a /logs/ or /access_logs/ folder, and you can use FTP to download them onto your computer.
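If you’re comfortable with a little scripting, the download step can be automated. Below is a minimal Python sketch using ftplib; the hostname, credentials and remote folder are placeholders, so swap in your own server details.

    from ftplib import FTP

    # Hypothetical server details - replace with your own host, login and log folder.
    FTP_HOST = "ftp.example.com"
    FTP_USER = "username"
    FTP_PASS = "password"
    REMOTE_LOG_DIR = "/logs/"

    ftp = FTP(FTP_HOST)
    ftp.login(FTP_USER, FTP_PASS)
    ftp.cwd(REMOTE_LOG_DIR)

    # Download every file in the remote log folder to the current directory.
    for filename in ftp.nlst():
        with open(filename, "wb") as local_file:
            ftp.retrbinary("RETR " + filename, local_file.write)

    ftp.quit()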

As server log files can get very large on high-traffic websites, it’s important to select an appropriate date range for your analysis. Log files covering too short a timeframe won’t give you much meaningful insight, but if you analyse too large a time period your analysis will be very slow and could suffer from overwhelming amounts of data.

To start with, I’d recommend taking log files from the most recent complete month and starting your analysis there. You’ll quickly learn whether this is sufficient data to work with, or whether you’ll need more.
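If your exported logs cover a longer period than you need, a quick script can carve out just the month you want before importing. This is a rough sketch that assumes the common/combined Apache log format, where each entry’s timestamp looks like [01/Apr/2016:13:55:36 +0000]; the file names and month are hypothetical.

    # Hypothetical file names and month - adjust to your own logs and date range.
    TARGET_MONTH = "Apr/2016"

    with open("access.log", "r", encoding="utf-8", errors="ignore") as source, \
         open("access_apr2016.log", "w", encoding="utf-8") as target:
        for line in source:
            # Matching on "/Apr/2016:" keeps only that month's entries.
            if "/" + TARGET_MONTH + ":" in line:
                target.write(line)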

Importing these log files into the Screaming Frog Log File Analyser is simple: just drag & drop the files into the main window when you start the tool for the first time. The tool might ask you to confirm the root URL of the website you’re analysing, as this can’t always be extracted from the log files themselves.

Once the files are imported, you’ll get a dashboard overview like this one:

Screaming Frog Log File Analyser - dashboard

In this dashboard you can see at a glance which search bots crawl your site and how often, how many pages they crawl, and which HTTP status codes are returned. This is a useful overview of how well-crawled your website is.
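To give a sense of what the dashboard is summarising, here’s a minimal Python sketch that tallies requests per search bot and per HTTP status code from a raw access log. It assumes the combined log format and a file called access.log, and the bot list is just an illustrative selection of user-agent substrings.

    import re
    from collections import Counter

    # Matches the request, status code and user-agent in a combined-format log line.
    LOG_PATTERN = re.compile(
        r'"(?:GET|POST|HEAD) \S+ [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
    )
    BOTS = ["Googlebot", "bingbot", "YandexBot"]  # illustrative user-agent substrings

    status_counts = Counter()
    bot_counts = Counter()

    with open("access.log", "r", encoding="utf-8", errors="ignore") as log:
        for line in log:
            match = LOG_PATTERN.search(line)
            if not match:
                continue
            status_counts[match.group("status")] += 1
            for bot in BOTS:
                if bot in match.group("agent"):
                    bot_counts[bot] += 1

    print("Status codes:", status_counts.most_common())
    print("Bot requests:", bot_counts.most_common())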

You can then dig further into the log file data. For example, you could identify your slowest-responding URLs, or find out which URLs return 4xx client errors such as 404 Not Found. For the latter report, just go to the Response Codes tab and select the type of HTTP status code you want to look at:

Screaming Frog Log File Analyser - Response Codes

Similarly, you can see which URLs on your site have been crawled and how often. You can filter all data by search bot, so you could for example see exactly how often Googlebot-Mobile comes to your site, and which pages it crawls.

Screaming Frog Log File Analyser - select bot

The log files themselves already offer a wealth of information which can help you analyse the performance of your website.

Importing URL Data

This type of analysis is pretty cool, but doesn’t tell you much on its own. The real fun begins when you combine this log file data with other URL data.

For example, you could compare the log file data to your XML sitemap, to see exactly how often your sitemap’s URLs are crawled and by which search bots. With standard log file analysers, you’d have to export the log file data into a spreadsheet and manually compare it to your sitemap’s URLs.
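For illustration, this is roughly what that manual comparison boils down to: parse the sitemap, extract the requested paths from the log, and diff the two sets. The file names here are hypothetical, and the sketch assumes a standard sitemap.xml and a combined-format access log; in practice you’d also filter the log lines by the bot’s user agent, as in the earlier snippet.

    import xml.etree.ElementTree as ET
    from urllib.parse import urlparse

    # Hypothetical file names; assumes a standard sitemap.xml and an access log.
    NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    sitemap_paths = set()
    for loc in ET.parse("sitemap.xml").getroot().findall("sm:url/sm:loc", NAMESPACE):
        sitemap_paths.add(urlparse(loc.text.strip()).path)

    crawled_paths = set()
    with open("access.log", "r", encoding="utf-8", errors="ignore") as log:
        for line in log:
            parts = line.split('"')
            if len(parts) > 1 and parts[1].startswith(("GET ", "HEAD ")):
                crawled_paths.add(parts[1].split()[1])

    print("In sitemap but never requested:", sitemap_paths - crawled_paths)
    print("Requested but not in sitemap:", crawled_paths - sitemap_paths)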

With the Screaming Frog Log File Analyser, the tool does the hard work for you. You can import URL data from a wide range of sources and combine it with the log file data to generate all kinds of insightful reports. Once you’ve imported both log file data and URL data, you can view the URLs that appear in both datasets, or in only one of them:

Screaming Frog Log File Analyser - select URL data

Now the tool’s real value becomes evident. By comparing log file data with any set of URL data, you can analyse your website for a whole range of different issues:

  • Which pages on your site are being crawled most often? Are these the pages you want crawled most?
  • Which pages on your site are not being crawled at all – i.e. orphan pages?
  • Are all your XML sitemap URLs being crawled? If not, why not?
  • How frequently is your News Sitemap being checked by Googlebot?
  • How often are your paginated pages being crawled vs your main category pages?
  • When a page is changed, how long does it take for the page to be re-crawled and for the search index to be updated?
  • How does a new inbound link affect crawl rate?
  • How quickly is your newly launched site or site section being crawled?
  • Are crawlers spending inordinate amounts of time crawling URLs that add no SEO value?
  • etc…

The potential for analysis is almost unlimited. By comparing any set of URL data with your log files, you can gather meaningful insight on your website’s interaction with search spiders like Googlebot. This in turn allows you to spot issues with your website that regular SEO tools might not be able to easily identify.

Time Saver & Brain Rester

When Screaming Frog first released their SEO Spider, it quickly became one of the SEO community’s favourite tools. While other crawl tools like Xenu existed before Screaming Frog’s, the wealth of features in the frog made life so much easier for professional SEOs. It has saved us incredible amounts of time we’d otherwise have had to spend manually analysing crawl data.

Now, with the Log File Analyser, the Screaming Frog folks have done it again. Once again they’ve created a tool that saves a lot of manual effort, making our lives as SEOs so much easier. I’ve only been using the Log File Analyser for a few weeks, but it’s already become a standard go-to tool for any new website I analyse.

Without the Log File Analyser, log files are a cumbersome data source requiring a lot of Excel sheets and VLOOKUP formulas. With the Analyser, my poor little brain gets a well-deserved rest, as the tool does most of the hard work for me. And for that I am infinitely grateful to the Screaming Frog team.

You can download the Screaming Frog Log File Analyser here.

About Barry Adams

Barry Adams is one of the chief editors of State of Digital and is an award-winning SEO consultant based in Belfast, delivering specialised SEO services to clients across Europe.

  • Jere Matlock

    Thanks so much for sharing this, Barry. Much appreciated. I count myself among those bemused old school SEOs wondering why people haven’t been looking at log files, and you’ve explained it nicely.

  • Nice article Barry, as always! By the way, do you think that tool prices will come down in the future?

    • Can’t see the price dropping anytime soon for either of these tools. For me Screaming Frog is an essential toolset, and £198 per year for both tools is very cost-effective.