Monitoring Overview
This page documents how to set up and use ScrapeOps monitoring with your scrapers.
Monitoring Product Updates
For the latest features and product changes, see the Monitoring Product Updates page.
💻 Demo
⭐ Features
- Scrapy Job Stats & Visualisation
  - 📈 Individual Job Progress Stats
  - 📊 Compare Jobs versus Historical Jobs
  - 💯 Job Stats Tracked
    - ✅ Pages Scraped & Missed
    - ✅ Items Parsed & Missed
    - ✅ Item Field Coverage (illustrated after this list)
    - ✅ Runtimes
    - ✅ Response Status Codes
    - ✅ Success Rates & Average Latencies
    - ✅ Errors & Warnings
    - ✅ Bandwidth
- Health Checks & Alerts
  - 🔍 Custom Spider & Job Health Checks
  - 📦 Out of the Box Alerts - Slack (More coming soon!)
  - 📑 Daily Scraping Reports
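To make the "Item Field Coverage" metric above concrete, here is a small, self-contained Python sketch (illustrative only, not ScrapeOps code) of what field coverage means: the share of scraped items in which each field was actually populated.

```python
from collections import Counter

def field_coverage(items, fields):
    """Illustrative only: percentage of items with a non-empty value per field."""
    counts = Counter()
    for item in items:
        for field in fields:
            if item.get(field) not in (None, "", []):
                counts[field] += 1
    total = len(items) or 1
    return {field: 100.0 * counts[field] / total for field in fields}

# Example: "price" is only populated in two of three items -> ~66.7% coverage.
items = [
    {"name": "Widget", "price": 9.99},
    {"name": "Gadget", "price": None},
    {"name": "Gizmo", "price": 4.50},
]
print(field_coverage(items, ["name", "price"]))
# {'name': 100.0, 'price': 66.66666666666667}
```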
🚀 Getting Started
To use ScrapeOps, you first need to create a free account and get your free API_KEY.
Currently, ScrapeOps integrates with both Python Requests and Python Scrapy scrapers; a sketch of the Scrapy integration is shown below.
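As a rough idea of what the Scrapy integration looks like, the snippet below is a minimal sketch assuming the scrapeops-scrapy SDK (`pip install scrapeops-scrapy`) and its `SCRAPEOPS_API_KEY` and `ScrapeOpsMonitor` extension settings; check the Scrapy integration guide for the exact, current settings names.

```python
# settings.py -- minimal sketch, assuming the scrapeops-scrapy SDK's
# documented settings names; verify against the Scrapy integration guide.

SCRAPEOPS_API_KEY = "YOUR_API_KEY"  # the free API_KEY from your ScrapeOps account

EXTENSIONS = {
    # ScrapeOps monitor extension reports job stats to your dashboard.
    "scrapeops_scrapy.extension.ScrapeOpsMonitor": 500,
}

DOWNLOADER_MIDDLEWARES = {
    # Optional: swap in the ScrapeOps retry middleware in place of Scrapy's
    # default one so retries show up in the tracked job stats.
    "scrapeops_scrapy.middleware.retry.RetryMiddleware": 550,
    "scrapy.downloadermiddlewares.retry.RetryMiddleware": None,
}
```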
More ScrapeOps Monitoring integrations are on the way.