[Tutorial] How to use the OnCrawl Data Explorer
OnCrawl's Data Explorer is a powerful tool that allows you to create downloadable, custom reports about your URLs
Written by Rebecca Berbel

Video transcription

Welcome to OnCrawl's Getting Started Tutorials. Today we're going to look at how to use our powerful Data Explorer to answer technical SEO questions with your data. We will be using crawl results that have been linked to data from logs and Google Analytics.

You can find the Data Explorer under Tools in the crawl report.

You can explore different types of data: Pages, links, comparisons with other crawls, and data from log monitoring.

The OnCrawl Query Language filters let you zoom in on any type of data.

You can use our pre-set quickfilters, or save your own.

Let's say you have a question, such as: "Which orphan pages should I reintegrate into my site architecture, and which should I assign a 410 status to?" Since you don't want to return a 410 status -- which means the page is permanently removed -- on pages that still receive bot hits or SEO sessions, exploring data related to orphan pages will help answer this question.

There's a chart for orphan pages in sitemaps. Let's start with that.

When you click on a chart in OnCrawl, you go directly to the Data Explorer with the filter used to produce the data in the chart.

This result is a little narrow, since it only lists orphan pages we know of from looking at sitemaps.

If we remove the sitemap requirement, we can see all orphan pages, no matter their origin.

Here are all our orphan pages. We're looking for those that still have SEO traffic. 

We can add filters for SEO sessions and for bot hits from log data.

These are the ### orphan pages that should be integrated back into the site structure.
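If you'd rather apply this same filter outside the interface -- for example, to a CSV exported from the Data Explorer -- the logic can be sketched in a few lines of Python. Note that the column names used here ("url", "seo_sessions", "bot_hits") are assumptions for illustration, not OnCrawl's exact export headers; check the header row of your own export.

```python
import csv
import io

# Hypothetical sample standing in for an exported Data Explorer CSV.
# The column names are assumptions -- adapt them to your real export.
sample = """url,seo_sessions,bot_hits
https://example.com/a,12,3
https://example.com/b,0,0
https://example.com/c,0,5
"""

reader = csv.DictReader(io.StringIO(sample))

# Keep orphan pages that still receive SEO sessions or bot hits:
# these are candidates to reintegrate rather than serve a 410.
keep = [row["url"] for row in reader
        if int(row["seo_sessions"]) > 0 or int(row["bot_hits"]) > 0]

print(keep)  # pages to reintegrate into the site structure
```

Pages that match neither condition are the ones you might consider assigning a 410 status to.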

You can save this query for later by clicking on [button]

All Data Explorer reports are 100% customizable. For example, because these are orphan pages, they haven't been crawled. You may not want to keep blank columns in the report.

You can sort by a column in both directions to make sure it's entirely blank.

And then remove it from the report.
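The same cleanup can be done programmatically on an exported CSV: find the columns that are blank on every row and drop them. This is a minimal sketch; the "fetch_status" column below is a hypothetical stand-in for any column left empty because orphan pages were never crawled.

```python
import csv
import io

# Hypothetical export sample; "fetch_status" stands in for a column
# that is blank on every row (orphan pages were not crawled).
sample = """url,fetch_status,source
https://example.com/a,,sitemaps
https://example.com/b,,logs
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# A column is blank if no row has a value in it.
blank = [col for col in rows[0] if all(not row[col] for row in rows)]

# Rebuild the rows without the blank columns.
cleaned = [{k: v for k, v in row.items() if k not in blank} for row in rows]

print(blank)  # columns removed from the report
```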

You may want to add additional information by adding columns, for example, the source where OnCrawl discovered this URL.

If you want to see how many of these orphan pages are discovered through log monitoring, you can filter the Sources column.
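As a sketch of that same source filter applied to an exported CSV: the snippet below counts the orphan pages whose sources include log monitoring. How OnCrawl actually formats the Sources column in its export (here assumed to be semicolon-separated values) is an assumption -- inspect your own file before reusing this.

```python
import csv
import io

# Hypothetical sample; the semicolon-separated "sources" format is an
# assumption, not OnCrawl's documented export format.
sample = """url,sources
https://example.com/a,sitemaps
https://example.com/b,logs
https://example.com/c,logs;sitemaps
"""

rows = csv.DictReader(io.StringIO(sample))

# Keep pages discovered (at least in part) through log monitoring.
from_logs = [row["url"] for row in rows
             if "logs" in row["sources"].split(";")]

print(len(from_logs))  # number of orphan pages seen in the logs
```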

To help you out with the next steps of your analysis, there are quick links here to get more information about a specific URL...

...or you can export the entire list as a CSV, like this.

OnCrawl's data explorer: simple and powerful.

Questions? Reach out to us from the OnCrawl interface by clicking on the blue Intercom button at the bottom of the screen, or tweet to us at OnCrawl_CS.

See you next time!

Until then, happy crawling.
