Video Transcription

In this tutorial, we're going to see how to cross-reference crawl and log data to get interesting information.

Let's say you want to know if you have active orphan pages that are crawled by Google. This is interesting information because those pages actually drive SEO traffic but are not attached to the website structure.

So to do that, all you have to do is add the following fields:

Crawled by Google - is - true


Then you add:

Depth - is - unknown


And finally add:

Has SEO visits - is - true


Then you click on "Apply filters".

So you see that you have a lot of those pages on your website, which is quite interesting, as they could create extra SEO value.
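
If you prefer to run the same check outside the interface, here is a minimal sketch of the equivalent logic in Python with pandas, assuming you have exported your crawl and log data to a CSV file. The file name and the column names (crawled_by_google, depth, has_seo_visits) are hypothetical and will differ depending on your tool's export format.

```python
import pandas as pd

# Hypothetical crawl + log export; the file and column names are
# assumptions and will differ depending on your tool's export format.
pages = pd.read_csv("crawl_export.csv")

# Active orphan pages: crawled by Google, no depth (i.e. not
# reachable from the site structure), and receiving SEO visits.
active_orphans = pages[
    pages["crawled_by_google"].eq(True)
    & pages["depth"].isna()          # "unknown" depth marks an orphan
    & pages["has_seo_visits"].eq(True)
]

print(f"{len(active_orphans)} active orphan pages found")
```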

You could also check whether you have some 404 error pages that could be fixed and reattached to the structure.

This can happen after a website restructuring, when pages become orphans: they are removed from the server but remain accessible from the SERPs even though they no longer really exist.
So if you want to know this information, you just have to add an extra field:

Googlebot status code - category - client error

So you can apply it, and you see that you have nine of these pages.
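
If you are working from an export instead, this extra condition is just one more filter on the same data. Continuing the hypothetical sketch above, the googlebot_status_code column is again an assumption about the export format.

```python
# Keep only the pages that returned a client error (4xx) to Googlebot.
broken_orphans = active_orphans[
    active_orphans["googlebot_status_code"].between(400, 499)
]

# Export the result, mirroring the export step in the interface.
broken_orphans.to_csv("404_active_orphan_pages.csv", index=False)
print(f"{len(broken_orphans)} active orphan pages in client error")
```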

You can save your filter if you want to reuse it later.
Let's say: "404 active orphan pages that need to be fixed".
You can also export this data.

As you can see, it's pretty simple and fast to get this information.

If you need any other information or if you have any questions, don't hesitate. Send us a tweet or an email (or click the blue Intercom button at the bottom right of your screen), and we'll be very happy to help.
