Perform log file analysis with Screaming Frog

Many SEO specialists use Screaming Frog for crawls, but forget that the tool suite is also excellent for log file analysis. It gives you insight into what search engines are really crawling on your site – and what they are not. In this article: how to analyze log files with Screaming Frog, step by step.

1. Why analyze log files?

Where a normal crawl shows what is technically possible, a log file shows what search engines actually do. Consider:

  • Which pages are visited by Googlebot?
  • How often is a particular URL crawled?
  • Crawl time vs. crawl budget: what is lost to irrelevant URLs?
  • Are redirects, 404s or parameter URLs called frequently?

Log analysis shows reality, not intent.
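
To make these questions concrete, here is a minimal Python sketch (independent of Screaming Frog) that counts Googlebot hits per URL in Apache-style access log lines. The log lines, IPs and URLs below are made up for illustration.

```python
import re
from collections import Counter

# Fictional Apache-style access log lines for illustration.
LOG_LINES = [
    '66.249.66.1 - - [10/Jun/2025:06:25:01 +0000] "GET /products HTTP/1.1" '
    '200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Jun/2025:06:25:07 +0000] "GET /old-page HTTP/1.1" '
    '404 320 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [10/Jun/2025:06:25:09 +0000] "GET /products HTTP/1.1" '
    '200 5120 "-" "Mozilla/5.0"',
]

# Pull the requested URL and status code out of each line.
REQUEST_RE = re.compile(r'"(?:GET|POST) (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def googlebot_hits(lines):
    """Count how often (self-declared) Googlebot requested each URL."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # skip requests from other user agents
        match = REQUEST_RE.search(line)
        if match:
            hits[match.group("url")] += 1
    return hits

hits = googlebot_hits(LOG_LINES)
```

This answers the first two bullets directly: which URLs Googlebot requested and how often. The Log File Analyser does the same aggregation for you, at scale.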

2. What do you need?

To analyze log files with Screaming Frog, you need the following:

  • A compressed or exported log file from your web server (e.g. Apache's or NGINX's access.log)
  • Or an export from a CDN such as Cloudflare or Akamai
  • A Screaming Frog Log File Analyser license (the free version only imports a limited number of log events; full analysis requires the paid version)

Ask your developer or hosting provider for at least 7–30 days of log data.

3. Import log file into Screaming Frog

  1. Open the Screaming Frog Log File Analyser (note: this is a separate application from the SEO Spider crawler)
  2. Click New Project and configure your project
  3. Import your log file(s), for example via drag and drop
  4. Specify which domain you want to analyze

The tool automatically recognizes user agents, URLs, status codes and timestamps.
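
What happens on import can be illustrated with a small parser sketch. The regular expression below assumes the common Apache "combined" log format; the sample line is fictional.

```python
import re
from datetime import datetime

# Fields of an Apache combined-format log line: IP, timestamp,
# request (method + URL), status code and user agent.
COMBINED_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Parse one combined-format line into a dict, or return None."""
    match = COMBINED_RE.match(line)
    if not match:
        return None
    record = match.groupdict()
    record["ts"] = datetime.strptime(record["ts"], "%d/%b/%Y:%H:%M:%S %z")
    record["status"] = int(record["status"])
    return record

sample = ('66.249.66.1 - - [10/Jun/2025:06:25:01 +0000] '
          '"GET /products HTTP/1.1" 200 5120 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')
rec = parse_line(sample)
```

If your server uses a custom log format, the Log File Analyser (and any parser like this sketch) needs the fields in a recognizable order; check a few raw lines before importing.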

4. Analyze Googlebot activity

Go to the Bots tab > Googlebot to see:

  • Which pages were visited most often
  • Status codes of visits (200, 301, 404, 5xx)
  • Last crawl date per page
  • Distribution by directory, subdomain or page type

You can also filter for other bots: Bingbot, AdsBot, etc.
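
Filtering by bot boils down to matching user-agent strings. A simplified sketch (the signature strings are illustrative; note that user agents can be spoofed, which is why Google recommends reverse-DNS verification for certainty):

```python
# Illustrative user-agent signatures per bot; real filtering in the
# Log File Analyser is more thorough, but the principle is the same.
BOT_SIGNATURES = {
    "Googlebot": "Googlebot",
    "Bingbot": "bingbot",
    "AdsBot": "AdsBot-Google",
}

def classify_bot(user_agent):
    """Map a user-agent string to a bot name, or 'Other'."""
    for name, signature in BOT_SIGNATURES.items():
        if signature.lower() in user_agent.lower():
            return name
    return "Other"
```

For example, `classify_bot("curl/8.4.0")` falls through to "Other", so ordinary traffic does not pollute your bot reports.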

5. Key insights you can extract

  • Insight: lots of crawl activity on 404s → Action: set up redirects or clean up the error pages
  • Insight: crawl activity on parameter URLs → Action: adjust robots.txt or set canonicals
  • Insight: important pages are not visited → Action: improve internal linking or include them in the sitemap
  • Insight: crawl activity on dev/test URLs → Action: exclude via robots.txt or require authentication
  • Insight: lots of crawl activity on redirects → Action: clean up redirect chains and link internally to the final URLs
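
These checks are easy to script as well. A minimal sketch that flags crawl-budget waste from (URL, status code) pairs; the sample data is fictional:

```python
from collections import Counter
from urllib.parse import urlsplit

# Fictional (url, status) pairs, as you would extract them from a log.
HITS = [
    ("/old-page", 404), ("/old-page", 404),
    ("/shop?color=red&sort=price", 200),
    ("/moved", 301), ("/products", 200),
]

def crawl_waste(hits):
    """Count hits that likely waste crawl budget, per category."""
    waste = Counter()
    for url, status in hits:
        if status == 404:
            waste["404s"] += 1
        elif 300 <= status < 400:
            waste["redirects"] += 1
        elif urlsplit(url).query:
            waste["parameter URLs"] += 1
    return waste

waste = crawl_waste(HITS)
```

Each non-empty category maps to one of the actions above: redirect or remove 404s, canonicalize or block parameter URLs, and flatten redirect chains.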

6. Combining log data with crawl data

Want to know which pages exist but are never visited?

  1. Perform a normal crawl of your site in the Screaming Frog SEO Spider
  2. Import the crawl export into the Log File Analyser as URL data
  3. The tool matches the crawl and log data per URL
  4. Filter on URLs that appear in the crawl but not in the log file → potential crawl budget issues

This provides insight into what Google is missing or ignoring.
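
The comparison in step 4 is essentially a set difference. A sketch with fictional URLs:

```python
# URLs found by a site crawl (fictional examples).
crawled = {"/", "/products", "/about", "/blog/deep-article"}

# URLs that Googlebot actually requested, according to the logs.
in_logs = {"/", "/products", "/about"}

# Pages that exist but were never visited: candidates for better
# internal linking or sitemap inclusion.
never_visited = sorted(crawled - in_logs)
```

The reverse difference (`in_logs - crawled`) is also interesting: URLs Google keeps requesting that your crawl no longer finds, such as deleted pages or old parameter URLs.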

7. Exporting & reporting

All tables can be exported to Excel/CSV. Useful for:

  • Reporting to clients or developers
  • Prioritizing technical tickets
  • Providing evidence in crawl budget discussions

Use the visual Overview tab for quick insight into the distribution by status code or crawl type.
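
If you want to post-process an export yourself, writing such a table comes down to a few lines. A sketch that writes a status-code distribution as CSV (the status codes are illustrative; `io.StringIO` stands in for a real file):

```python
import csv
import io
from collections import Counter

# Illustrative status codes as extracted from a log file.
statuses = [200, 200, 301, 404, 200, 500]
dist = Counter(statuses)

# Write the distribution as CSV; in practice you would open a file
# instead of an in-memory buffer.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["status", "hits"])
for status, hits in sorted(dist.items()):
    writer.writerow([status, hits])
csv_text = buf.getvalue()
```

The resulting CSV drops straight into Excel or a reporting dashboard.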

In conclusion

Log file analysis with Screaming Frog gives you data you won't see anywhere else: how Google really moves through your site. Combine this with crawl data and you have a powerful foundation for technical SEO decisions.

Ralf van Veen

Senior SEO-specialist

I have been working for 12 years as an independent SEO specialist for companies (in the Netherlands and abroad) that want to rank higher in Google in a sustainable way. During this period I have advised A-brands, set up large-scale international SEO campaigns and coached global development teams on search engine optimization.

With this broad experience in SEO, I developed the SEO course and helped hundreds of companies improve their findability in Google in a sustainable and transparent way. You can consult my portfolio, references and collaborations for this.

This article was originally published on 3 June 2025 and last updated on 18 July 2025. The content of this page was written and approved by Ralf van Veen. Learn more about the creation of my articles in my editorial guidelines.