Perform log file analysis with Screaming Frog

Many SEO specialists use Screaming Frog for crawls, but forget that the company also offers an excellent tool for log file analysis: the Log File Analyser. It shows you what search engines are actually crawling on your site, and what they are skipping. In this article: how to analyze log files with Screaming Frog, step by step.
1. Why analyze log files?
Where a normal crawl shows what is technically crawlable, a log file shows what search engines actually do. Consider:
- Which pages are visited by Googlebot?
- How often is a particular URL crawled?
- How much crawl budget is lost to irrelevant URLs?
- Are redirects, 404s or parameter URLs requested frequently?
Log analysis shows reality, not intent.
2. What do you need?
To analyze log files with Screaming Frog, you need the following:
- An exported (possibly compressed) access log from your web server (Apache and NGINX both typically write to access.log)
- Or export from a CDN such as Cloudflare / Akamai
- A Screaming Frog Log File Analyser license (the free version is limited to a single project and 1,000 log events)
Ask your developer or hosting provider for at least 7, and preferably 30, days of log data.
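Not sure what such a log file contains? Each line is one request. A minimal Python sketch, assuming the widely used Combined Log Format (the default for Apache and NGINX; field order can differ per server configuration), with a made-up Googlebot entry:

```python
import re

# Combined Log Format: the default access-log format for Apache and NGINX.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<user_agent>[^"]*)"'
)

# Made-up log entry showing a Googlebot visit (IP and URL are illustrative).
sample = ('66.249.66.1 - - [10/Mar/2024:06:25:14 +0000] '
          '"GET /category/shoes?color=red HTTP/1.1" 200 5120 '
          '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

match = LOG_PATTERN.match(sample)
if match:
    print(match.group('url'), match.group('status'), match.group('user_agent'))
```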
3. Import log file into Screaming Frog
- Open the Screaming Frog Log File Analyser (a separate application, not a mode of the SEO Spider)
- Create a New Project
- Import your log file(s), for example by dragging them into the project window
- Specify which domain you want to analyze
The tool automatically recognizes user agents, status codes and timestamps.
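Hosting providers often deliver logs as rotated, gzipped files (access.log.1.gz, access.log.2.gz, and so on). If you prefer to consolidate them into a single file before import, a quick Python sketch; the paths are illustrative:

```python
import glob
import gzip
import shutil

# Merge rotated, gzipped access logs into one plain-text file for import.
with open('combined_access.log', 'wb') as out:
    for path in sorted(glob.glob('logs/access.log*.gz')):
        with gzip.open(path, 'rb') as gz:
            shutil.copyfileobj(gz, out)
```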
Getting started with SEO? Feel free to get in touch.

4. Analyze Googlebot activity
Filter on Googlebot (via the user-agent filter or the User Agents tab) to see:
- Which pages were visited most often
- Status codes of visits (200, 301, 404, 5xx)
- Last crawl date per page
- Distribution by directory, subdomain or page type
You can also filter for other bots: Bingbot, AdsBot, etc.
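These tallies are easy to sanity-check outside the tool as well. A minimal sketch that counts Googlebot hits per URL and per status code from the consolidated log of step 3; note that matching on user-agent strings alone can be fooled by spoofed bots, which the Log File Analyser's bot verification accounts for:

```python
import re
from collections import Counter

# Pull URL, status code and user agent out of Combined Log Format lines.
LINE = re.compile(r'"\S+ (?P<url>\S+) \S+" (?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"')

url_hits, status_hits = Counter(), Counter()
with open('combined_access.log', encoding='utf-8', errors='replace') as f:
    for line in f:
        m = LINE.search(line)
        # User-agent matching only; spoofed "Googlebot" requests slip through.
        if m and 'Googlebot' in m.group('ua'):
            url_hits[m.group('url')] += 1
            status_hits[m.group('status')] += 1

print(url_hits.most_common(20))   # most-crawled URLs
print(status_hits.most_common())  # status-code distribution
```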
5. Key insights you can extract
| Insight | Action |
| --- | --- |
| Lots of crawl activity on 404s | Set up redirects or clean up the error pages |
| Crawl activity on parameter URLs | Adjust robots.txt (see the example below) or set canonicals |
| Important pages are not visited | Improve internal linking or include them in the XML sitemap |
| Crawl activity on dev/test URLs | Exclude them via robots.txt or require authentication |
| Lots of crawl activity on redirects | Clean up redirect chains and link internally to the final URLs |
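For the parameter and dev/test rows, a robots.txt sketch could look like this. The parameter names and paths are placeholders, so verify them against your own URL structure first; also remember that Google cannot see canonical tags on pages it is blocked from fetching:

```
User-agent: *
# Block faceted/parameter URLs (placeholder parameter names)
Disallow: /*?color=
Disallow: /*&sort=

# Block dev/test environments (placeholder paths)
Disallow: /staging/
Disallow: /test/
```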
6. Combining data with crawl data
Want to know which pages exist but are not visited?
- Perform a normal crawl of your site in the Screaming Frog SEO Spider and export the internal URLs
- Import that export into the Log File Analyser as URL data
- The tool matches the crawl data against the log data
- Filter on "Not in Log File" (URLs found in the crawl but never requested) → potential crawl budget issues
This provides insight into what Google is missing or ignoring.
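You can also do this comparison outside the tool. A rough sketch, assuming a hypothetical crawled_urls.txt with one URL per line (exported from the SEO Spider) and the consolidated access log from step 3:

```python
import re

# URLs found by the crawler: one URL per line (hypothetical export file).
with open('crawled_urls.txt', encoding='utf-8') as f:
    crawled = {line.strip() for line in f if line.strip()}

# URL paths search engines actually requested, pulled from the access log.
URL = re.compile(r'"\S+ (\S+) \S+"')
with open('combined_access.log', encoding='utf-8', errors='replace') as f:
    logged = {m.group(1) for m in (URL.search(line) for line in f) if m}

# In practice, normalize first: crawl exports contain full URLs while logs
# contain paths, so strip scheme/host and unify trailing slashes.
for url in sorted(crawled - logged):
    print(url)  # crawlable but never requested: crawl-budget candidates
```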
7. Exporting & reporting
All tables are exportable to Excel/CSV. Useful for:
- Client or developer reporting
- Prioritizing technical tickets
- Providing evidence in crawl budget discussions
Use the visual “Overview” tab for quick insight into the distribution of status codes and crawl activity over time.
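Those exports also lend themselves to quick scripted summaries, for example with pandas. The file and column names below ('log_events_export.csv', 'URL', 'Status Code') are assumptions; adjust them to whatever your export actually contains:

```python
import pandas as pd

# Load a Screaming Frog export; column names are assumptions, adjust to your file.
events = pd.read_csv('log_events_export.csv')

# Crawl volume per status code, e.g. for a crawl-budget report.
print(events.groupby('Status Code').size().sort_values(ascending=False))

# Top 20 most-requested URLs.
print(events['URL'].value_counts().head(20))
```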
In conclusion
Log file analysis through Screaming Frog gives you data you won’t see anywhere else: how Google is really moving through your site. Combine this with crawl data and you have a powerful foundation for technical SEO decisions.