[Tutorial] How to combine crawl and log data

Use cross-data analysis on crawl and log data to find active orphan pages and other interesting information.


Video Transcription

Welcome to OnCrawl's Getting Started Tutorials. Today we're going to look at cross-data analysis between log data and crawl data.

You may be interested in how technical SEO impacts page performance. For example, the crawl records the depth distribution of your website's pages.

When we look at organic and bot hits from log files, it can reveal how depth influences activity on the website.
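This kind of cross-analysis can also be sketched outside the interface. Here is a minimal illustration, assuming hypothetical exported data; the names `crawl_pages` and `log_hits` are illustrative and do not reflect OnCrawl's actual export format:

```python
from collections import Counter

# Hypothetical crawl export: URL -> depth (clicks from the home page).
crawl_pages = {
    "/": 0,
    "/category": 1,
    "/category/item-a": 2,
    "/category/item-b": 2,
}

# Hypothetical log export: one entry per organic (SEO) hit on a URL.
log_hits = ["/", "/", "/category", "/category/item-a"]

# Cross the two datasets: count log hits per crawl depth.
hits_by_depth = Counter()
for url in log_hits:
    depth = crawl_pages.get(url)
    if depth is not None:  # skip URLs the crawl did not find
        hits_by_depth[depth] += 1

for depth in sorted(hits_by_depth):
    print(f"depth {depth}: {hits_by_depth[depth]} hits")
```

The same join-by-URL idea underlies the charts shown in the interface: each page's depth comes from the crawl, and its activity comes from the logs.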

You can find analyses from log monitoring here (log monitoring) and here (SEO impact report).

Today, we're going to look at page depth.

On this site, there's a pretty clear relationship between depth and activity.

Deep pages on this site have almost no hits.

And the deeper we go, the smaller the percentage of pages at that level that account for the visits.

Incidentally, bot hits follow a broadly similar pattern.

Let's say we want to examine this peak of activity at a depth of 6. Maybe it's something we can use to place these pages higher in the site structure or to move visitors from these pages to more important pages.

Click on the segment of the chart that interests you to go directly to the Data Explorer.

I'm interested in the pages that earn the most organic traffic, so I'm going to sort by SEO visits per URL.

This shows me the top URLs at this depth, which might be useful if I have a lot of pages here.

I might want to export this list...

...or explore all data collected for this URL.

Cross-data analysis with OnCrawl is quick and easy!
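The article's summary also mentions active orphan pages: URLs that receive visits in the logs but were not found by the crawl, meaning they are not reachable through the site structure. A minimal sketch of that cross-check, using hypothetical data (the names are illustrative, not OnCrawl's export format):

```python
# Hypothetical datasets: URLs found by the crawl vs. URLs hit in the logs.
crawled_urls = {"/", "/category", "/category/item-a"}
logged_urls = {"/", "/category/item-a", "/old-promo", "/category/item-b"}

# Active orphan pages: hit in the logs, but absent from the crawl
# (i.e. not linked from anywhere the crawler could reach).
active_orphans = sorted(logged_urls - crawled_urls)
print(active_orphans)  # ['/category/item-b', '/old-promo']
```

Pages that turn up in this difference are candidates for re-linking into the site structure, since they already attract traffic.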

Questions? Reach out to us from the OnCrawl interface by clicking on the blue Intercom button at the bottom of the screen, or tweet us at @OnCrawl_CS.

See you next time!

Until then, happy crawling.
