General information

General information about how to use Oncrawl

38 articles
Projects
What is a project in Oncrawl and how to manage projects

Auditing Core Web Vitals
Optimize Core Web Vitals using Oncrawl, understanding their role in Google's algorithms and impact on rankings.

Monthly URL quota: what it means in Oncrawl
Oncrawl plans are based on monthly quotas. Learn what this means and how to manage your quota effectively.

How to use dashboard templates
Use dashboard templates to customize your analysis results and easily add new or subject-specific dashboards to your analyses.

Domain ownership verification (Verified domains)
Discover Oncrawl's domain ownership policy and how to verify your own domains in order to unlock advanced crawl settings.

What files are considered to be "resources" in Oncrawl?
When the crawler or log analyzer encounters a reference to certain types of files, it groups them as "resources". Here's the complete list.

GA4 Metrics: Understanding them for enhanced data analysis
Harness GA4 metrics in Oncrawl for thorough cross-analysis with crawl data, exploring metrics, reports, and filters.

How to check if Oncrawl services are up or down
Check whether everything's working as expected and receive maintenance notifications

Archived crawls and how to un-archive them
Oncrawl archives old crawls. The data's still there. Here's how to unarchive all or part of an archived crawl.

Rename a project
Need to change the name of a project? It's easy.

Invite a new member to a workspace
How to invite someone to view or collaborate in Oncrawl

How to Open and Work with CSV Files in Excel or Sheets
Handle CSV files effectively in Excel/Sheets with insights from Oncrawl's Data Explorer and enhance your data management skills.

How to crawl a site with a robots.txt crawl-delay greater than one second
Oncrawl supports the crawl-delay directive of the robots.txt file. Here's how to use it.

How to modify crawl limits while crawling
How to change crawl speed and max crawl depth after launching your crawl

How does Oncrawl calculate load time?
This article explores what load time is and how it is calculated in Oncrawl.

How to quickly scan and analyze a list of URLs?
Sometimes you need to analyze a list of URLs, whatever its size, without crawling the entire site. Here's how to proceed.

What format can I use for my sitemaps?
You can use sitemaps to compare and add data to a crawl. Here are the formats Oncrawl supports.

How to explore all of the URLs in my sitemap
Sometimes you just want data on the URLs in your sitemap. It's possible to do this using a list of URLs with Oncrawl.

How to check URLs in a sitemap
Here is how to make sure that your sitemaps will be taken into account during the analysis

How can I find the target of a 3xx (redirection) status code?
3xx status codes indicate a redirected page. Here's how to find the page that was redirected, as well as the page it was redirected to.

How to use crawl results to find redirection loops
Your crawl results can help you to quickly produce a list of all redirected pages and the pages to which they are redirected.

How to create a file listing all links pointing to a 301 URL, the old URL, and the new URL
You have permanently redirected (301) URLs. Here's how to create a list of all links pointing to them, along with the old URL and the new URL.

How to crawl a staging/pre-prod website
It's good practice to protect staging websites and pre-production websites from bots. Here's how to crawl one before it goes live.

Why is my data different between Oncrawl and Google Analytics (GA4)?
Are you seeing differences between the numbers in Google Analytics (GA4) and the numbers in Oncrawl when looking at your GA4 data?

How to bypass geographic redirects by crawling with HTTP headers
Does your site redirect the Oncrawl bot because it doesn't have the right location-based cookie? Here's how to fix it.

Migrating from GA Universal Analytics to GA4 in Oncrawl
Are you replacing your GA-UA account with GA4? Here's everything you need to do to update your data sources in Oncrawl.

Manage workspace members with access to a project
How to check or modify the workspace members with access to a specific project

Link Dataset
Understanding the information collected by Oncrawl about a website's links

How to use REGEX in Oncrawl
Use regular expressions to create filters in the Data Explorer and in segmentations, and use pattern detection in fields to get to the essentials faster.

Mastering SEO alerts: How to set up and monitor them with Oncrawl
Learn how to establish and manage SEO alerts with Oncrawl, proactively monitoring metrics like 404s, redirects, and duplicate titles.

Lucene REGEX Cheat Sheet
In this article you'll learn how to use regex with Oncrawl. This article is based on the Elasticsearch documentation.

Custom dashboards
Make your own dashboards to track key issues in one place

Crawl over crawl
A crawl over crawl compares the results of two crawls to show you the impact of your changes or how two sites differ.

Data scraping and custom fields
Data scraping is an option that allows you to analyze a portion of the source code extracted during a crawl using custom fields.

How to crawl a site that uses JavaScript
If all or part of your website is built using JavaScript (JS), you may need to render pages so that a bot can crawl them.

Customize your user-agent in Oncrawl
Learn how to change the bot identity used to crawl your website.

Getting Started with the Oncrawl API
Oncrawl is built on a platform organized around an API. Here's how to easily create your own application that queries it.

Oncrawl connector for Looker Studio
Answers to your questions about the Oncrawl connector for Looker Studio (formerly Data Studio)
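
As a small illustration of the redirect-chain analysis covered in the articles above ("How can I find the target of a 3xx (redirection) status code?" and "How to use crawl results to find redirection loops"), here is a minimal sketch of the chain-following logic involved. The redirect map below is hypothetical sample data, not an actual Oncrawl export, and its shape may differ from the columns you get from the Data Explorer.

```python
# Minimal sketch: following 301 redirect chains from crawl data.
# `redirects` maps a source URL to its redirect target; the sample
# data here is hypothetical, not an Oncrawl export format.

def resolve_redirect(url, redirects, max_hops=10):
    """Follow a URL through a {source: target} redirect map.

    Returns (final_url, hops, looped). `looped` is True when the
    chain revisits a URL (a redirect loop) or is still redirecting
    after max_hops.
    """
    seen = {url}
    hops = 0
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
        if url in seen:          # chain came back to a known URL: loop
            return url, hops, True
        seen.add(url)
    # If the loop stopped because of max_hops, the URL still redirects.
    return url, hops, url in redirects

redirects = {
    "/old-page": "/new-page",    # simple 301 to a final destination
    "/a": "/b", "/b": "/a",      # a two-step redirect loop
}

print(resolve_redirect("/old-page", redirects))  # ('/new-page', 1, False)
print(resolve_redirect("/a", redirects))         # ('/a', 2, True)
```

The same pass over all redirected source URLs yields the "old URL / new URL" listing described above, with looping chains flagged for fixing.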