Introduction

The following documentation explains how to set up and use ScrapeOps with your scrapers.

💻 Demo

🔗 ScrapeOps Dashboard Demo

⭐ Features

  • Scrapy Job Stats & Visualisation

    • 📈 Individual Job Progress Stats
    • 📊 Compare Jobs versus Historical Jobs
    • 💯 Job Stats Tracked
      • ✅ Pages Scraped & Missed
      • ✅ Items Parsed & Missed
      • ✅ Item Field Coverage
      • ✅ Runtimes
      • ✅ Response Status Codes
      • ✅ Success Rates & Average Latencies
      • ✅ Errors & Warnings
      • ✅ Bandwidth
  • Health Checks & Alerts

    • 🔍 Custom Spider & Job Health Checks
    • 📦 Out-of-the-Box Alerts - Slack (More coming soon!)
    • 📑 Daily Scraping Reports
  • Scrapyd Cluster Management

    • 🔗 Integrate With Scrapyd Servers
    • ⏰ Schedule Periodic Jobs
    • 💯 Full Scrapyd JSON API Support
    • 🔐 Secure Your Scrapyd with Basic Auth, HTTPS or Whitelisted IPs

🚀 Getting Started

To use ScrapeOps, you first need to create a free account and get your free API key.

ScrapeOps currently integrates with both Python Requests and Python Scrapy scrapers:

  1. Python Requests Integration
  2. Python Scrapy Integration
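
For the Scrapy integration, setup typically amounts to installing the ScrapeOps SDK and adding your API key and the monitoring extension to your project's `settings.py`. The sketch below is a minimal example assuming the `scrapeops-scrapy` package and its `ScrapeOpsMonitor` extension path; check the Python Scrapy Integration guide above for the exact package and setting names.

```python
# settings.py — a minimal sketch of enabling ScrapeOps monitoring in a
# Scrapy project. The package name (scrapeops-scrapy), extension path,
# and setting names are assumptions drawn from typical usage; verify
# them against the integration guide.

# First: pip install scrapeops-scrapy

SCRAPEOPS_API_KEY = "YOUR_API_KEY"  # the API key from your free ScrapeOps account

EXTENSIONS = {
    # Registers the ScrapeOps monitor so job stats (pages scraped,
    # items parsed, runtimes, status codes, etc.) are reported to
    # your dashboard.
    "scrapeops_scrapy.extension.ScrapeOpsMonitor": 500,
}
```

Once the extension is enabled, running your spiders as usual (`scrapy crawl <spider_name>`) should cause job stats to appear in the ScrapeOps dashboard.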