Step 1: Make sure your log files are compliant with OnCrawl requirements
- query path (/blog/) or full URL
- vhost, if not already present in the full URL
- scheme (http or https), if not already present in the full URL
- date with time and timezone
- User Agent
- status code
- port of the request (80 on HTTP / 443 on HTTPS)
Optional but highly recommended fields:
- client IP: used to detect fake Googlebot hits
- size of response in bytes
Sample log line for a bot hit:
www.oncrawl.com:80 126.96.36.199 - - [07/Feb/2018:17:06:04 +0000] "GET /blog/ HTTP/1.1" 200 14486 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" "-"
Sample log line for an SEO visit:
www.oncrawl.com:80 188.8.131.52 - - [07/Feb/2018:17:06:04 +0000] "GET /blog/ HTTP/1.1" 200 37073 "https://www.google.es/" "Mozilla/5.0 (Linux; Android 7.0; SM-G920F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.137 Mobile Safari/537.36" "-"
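To double-check that a line contains every required field, you can split it with a one-liner. Here is a minimal sketch in shell, assuming the exact whitespace layout of the sample lines above (field positions will differ for other log formats):

```shell
# One of the sample bot-hit lines from above.
LINE='www.oncrawl.com:80 126.96.36.199 - - [07/Feb/2018:17:06:04 +0000] "GET /blog/ HTTP/1.1" 200 14486 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" "-"'

# Split on whitespace and pull out the fields OnCrawl requires:
# vhost + port (field 1), client IP (field 2), date (field 5, minus the
# leading "["), status code (field 10), and response size (field 11).
parsed="$(printf '%s\n' "$LINE" | awk '{ split($1, hp, ":"); printf "vhost=%s port=%s ip=%s date=%s status=%s bytes=%s\n", hp[1], hp[2], $2, substr($5, 2), $10, $11 }')"
echo "$parsed"
```

If any of these fields comes out empty or misplaced, your log format differs from the samples and the field positions need adjusting.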
If you host multiple subdomains (for example, https://www.oncrawl.com and https://fr.oncrawl.com) on the same server, make sure that the two subdomains are clearly identified in your logs. One way to do this is to verify that the full URL is used instead of just the query path.
- If your logs do not contain the fields listed above, please contact your IT team to have them added (otherwise the dashboards might lack important data)
- If your logs contain the fields listed above, please move on to the next step
Learn how to configure the right log format when using Apache or Nginx here
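For reference, a format close to the sample lines above can be produced with Apache's stock vhost_combined log format. This is a sketch rather than OnCrawl's official requirement, so adapt it to your own server; the trailing quoted "-" in the samples likely comes from an extra directive not shown here. Nginx offers an equivalent through its log_format directive.

```apache
# Essentially Apache's stock "vhost_combined" format:
# vhost:port, client IP, identity, user, date with timezone,
# request line, status code, bytes sent, referer, user agent.
LogFormat "%v:%p %h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" vhost_combined
CustomLog ${APACHE_LOG_DIR}/access.log vhost_combined
```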
Step 2: Validate your Log Files using OnCrawl Parsing Process
Click on the "ADD LOGS" button
Next, enable the fields that match your log files' format.
OnCrawl should then automatically process your log files according to what you've chosen.
Step 3: Upload your log files to OnCrawl FTP Account
You have access to your log files, and they are now ready to be uploaded to OnCrawl.
First, make sure that your network firewall is open for FTP connections.
Use an FTP client (e.g. FileZilla or any other) to connect to:
Username: 'OnCrawl Username'
Password: 'OnCrawl Password'
- Choose the directory matching the name of your project
- Drop the file(s) into that directory
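If you prefer the command line to a graphical client, the same steps can be scripted. Here is a sketch using lftp, with placeholder host, username, and project directory (the real values come from your OnCrawl account). The command is only printed so you can review it before running it yourself:

```shell
# Placeholders -- replace with the host and credentials from your OnCrawl account.
FTP_HOST="ftp.example.com"        # hypothetical host, not the real OnCrawl FTP address
FTP_USER="oncrawl-username"       # your OnCrawl username
PROJECT_DIR="your-project-name"   # must match the directory named after your project

# Connect, enter the project directory, and upload the rotated access logs.
# The command is echoed (not executed) so you can inspect it first;
# drop the echo to actually run the upload.
upload_cmd="lftp -u $FTP_USER ftp://$FTP_HOST -e \"cd $PROJECT_DIR; mput access.log*; bye\""
echo "$upload_cmd"
```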
You're (almost) done.
If you've properly configured Step 2 and your log files possess the right format, move on to the Log Manager Tool (Step 4).
Step 4: Monitor the Log Processing from OnCrawl Log Manager Tool
OnCrawl's Log Manager Tool allows you to monitor the processing of your log files.
You can find the "LOG MANAGER TOOL" button where the "ADD LOGS" button used to be.
What sort of information can be monitored?
OnCrawl activity displays information regarding OnCrawl parsing jobs: files queued for parsing, currently being parsed, queued for export, and currently being exported.
In this first stage, OnCrawl evaluates the data contained in your log files.
In the second stage, the Log Manager Tool displays graphs covering the upload process and the parsing process.
Finally, the tool shows an explorable data table with information including File Name, Deposit Date, File Size, OK Lines, Erroneous Lines, and Filtered Lines.
File Names are clickable and explorable.
Processed Files Legend:
- Having lots of log lines in the "File Size" and "OK Lines" columns is fine
- Having a few log lines in the "Erroneous Lines" and "Warnings" columns should not be a problem
- On the contrary, having lots of log lines in the "Erroneous Lines" and "Warnings" columns indicates a parsing error. In that case, contact us via the OnCrawl chat box; we are happy to help.
Step 5: Explore OnCrawl Log Monitoring Dashboards
You are now all set. Click on the "SHOW LOGS MONITORING" button and begin your log analysis.