
Server Manager & Scheduler FAQ

Welcome to the ScrapeOps Server Manager & Scheduler FAQ page. Here you will find answers to the most common questions users have when using the ScrapeOps Server Manager & Scheduler.

Frequently Asked Questions

  • Overview
  • Integration Methods
  • Getting Started & Usage
  • Features
  • Plans & Billing


Overview

What Is The Server Manager & Scheduler?

The ScrapeOps Server Manager & Scheduler is a DevOps tool that lets you deploy, manage, and schedule your web scrapers from the ScrapeOps dashboard.

Instead of manually SSH-ing into servers or setting up cron jobs, you can:

  • Connect your servers via SSH or Scrapyd
  • Deploy scrapers directly from GitHub
  • Schedule periodic scraping jobs
  • Monitor all your scrapers from one dashboard

It's designed to work alongside ScrapeOps Monitoring, giving you a complete solution for managing your scraping infrastructure.

What Server Providers Are Supported?

The Server Manager works with any server that supports SSH connections, including:

| Provider       | Integration | Status       |
| -------------- | ----------- | ------------ |
| Digital Ocean  | SSH         | ✅ Available |
| AWS EC2        | SSH         | ✅ Available |
| Vultr          | SSH         | ✅ Available |
| Any SSH Server | SSH         | ✅ Available |
| Scrapyd        | HTTP API    | ✅ Available |

If you can SSH into your server, you can connect it to ScrapeOps.

What Types Of Scrapers Can I Deploy?

With the SSH integration, you can deploy and run any type of scraper:

  • Python (Scrapy, Requests, BeautifulSoup, etc.)
  • Node.js (Puppeteer, Playwright, Cheerio, etc.)
  • Any other language or framework

The Scrapyd integration is specifically for Python Scrapy projects.
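To give a sense of what "any type of scraper" means in practice, here is a minimal sketch of a standalone Python script (using the Requests and BeautifulSoup libraries, with quotes.toscrape.com as a stand-in target site) that you could deploy and run over SSH:

```python
# Minimal example of a standalone scraper that could be deployed over SSH.
# The target URL and CSS selector are placeholders for your own site.
import requests
from bs4 import BeautifulSoup

def scrape_quotes(url="http://quotes.toscrape.com"):
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    # Extract the text of each quote on the page.
    return [q.get_text(strip=True) for q in soup.select("span.text")]

if __name__ == "__main__":
    for quote in scrape_quotes():
        print(quote)
```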


Integration Methods

What Is The SSH Server Integration?

The SSH server integration allows you to connect any SSH-capable server to ScrapeOps. Once connected, you can:

  • Deploy scrapers directly from GitHub to your servers
  • Schedule periodic jobs
  • Manage your scraping jobs from the dashboard

This is the recommended integration method as it supports all scraper types (Python, Node.js, etc.).

To set up the SSH integration, follow our SSH integration setup guides.

What Is The Scrapyd Server Integration?

The Scrapyd integration connects directly to Scrapyd servers via their HTTP API. It supports:

  • Full Scrapyd JSON API compatibility
  • Scheduling periodic jobs
  • Secure connections with BasicAuth, HTTPS, or IP whitelisting

This integration is only applicable to Python Scrapy projects running on Scrapyd.

To set up Scrapyd integration, follow our Scrapyd Integration Guide.
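To illustrate what working with the Scrapyd HTTP API looks like, below is a minimal sketch that queues a spider run against Scrapyd's schedule.json endpoint. The server address, credentials, project, and spider names are placeholders for your own setup:

```python
# Minimal sketch of calling the Scrapyd JSON API directly.
import requests

SCRAPYD_URL = "http://your-scrapyd-server:6800"  # placeholder address

def schedule_spider(project, spider):
    # schedule.json queues a spider run and returns a job id.
    resp = requests.post(
        f"{SCRAPYD_URL}/schedule.json",
        data={"project": project, "spider": spider},
        auth=("username", "password"),  # only if BasicAuth is enabled
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"status": "ok", "jobid": "..."}

print(schedule_spider("my_project", "my_spider"))
```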

Which Integration Method Should I Use?

| Use Case                                       | Recommended Integration |
| ---------------------------------------------- | ----------------------- |
| Python Scrapy only                             | Either SSH or Scrapyd   |
| Multiple scraper types (Python, Node.js, etc.) | SSH                     |
| Need GitHub deployment                         | SSH                     |
| Already running Scrapyd                        | Scrapyd                 |
| Maximum flexibility                            | SSH                     |

SSH Integration is recommended for most users as it offers the most flexibility and supports all scraper types.


Getting Started & Usage

How Do I Connect My Server To ScrapeOps?

  1. Sign up for a free ScrapeOps account
  2. Navigate to the Servers page in the dashboard
  3. Click Add Server and follow the setup wizard
  4. For SSH: Provide your server's IP address and SSH credentials
  5. For Scrapyd: Provide your Scrapyd server URL and authentication details

Detailed setup guides are available for both the SSH and Scrapyd integrations.
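
Before adding an SSH server, it can help to confirm that the credentials you plan to register actually work. Below is a minimal sketch using the third-party paramiko library; the IP address, username, and key path are placeholders:

```python
# Minimal sketch of verifying SSH access to a server before registering it.
import os
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    hostname="203.0.113.10",                                # your server's IP address
    username="scrapeops",                                   # the SSH user you will register
    key_filename=os.path.expanduser("~/.ssh/id_rsa"),       # or password="..." for password auth
)
# Run a quick command to confirm the connection works.
stdin, stdout, stderr = client.exec_command("python3 --version")
print(stdout.read().decode().strip())
client.close()
```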

Can I Deploy Scrapers From GitHub?

Yes! With the SSH server integration, you can connect your GitHub repository and deploy scrapers directly to your servers.

The workflow is:

  1. Connect your GitHub repo to ScrapeOps
  2. Select the server to deploy to
  3. ScrapeOps pulls the code and sets up the scraper
  4. Schedule or run jobs from the dashboard

This makes it easy to keep your scrapers up to date - just push to GitHub and redeploy.
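As a rough illustration, the deployment step automates something like the following pull-and-run sequence; the repo URL, target directory, and entry point are placeholders:

```python
# Rough sketch of the pull-latest-and-run sequence that GitHub deployment automates.
import subprocess
from pathlib import Path

REPO_URL = "https://github.com/your-org/your-scraper.git"  # placeholder repo
TARGET_DIR = Path("/opt/scrapers/your-scraper")            # placeholder directory

def deploy_and_run():
    if TARGET_DIR.exists():
        # Pull the latest code if the repo is already cloned.
        subprocess.run(["git", "-C", str(TARGET_DIR), "pull"], check=True)
    else:
        subprocess.run(["git", "clone", REPO_URL, str(TARGET_DIR)], check=True)
    # Install dependencies and run the scraper's entry point.
    subprocess.run(["pip", "install", "-r", "requirements.txt"], cwd=TARGET_DIR, check=True)
    subprocess.run(["python", "run_scraper.py"], cwd=TARGET_DIR, check=True)

if __name__ == "__main__":
    deploy_and_run()
```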

How Does Job Scheduling Work?

Job scheduling lets you run scrapers automatically on a recurring basis. You can set up periodic schedules so your scrapers run at defined intervals.

Scheduled jobs integrate with ScrapeOps Monitoring, so you can track performance and get alerts when something goes wrong.
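For context, a scheduled job replaces the kind of hand-rolled loop or cron entry you would otherwise maintain on the server yourself, sketched below with a placeholder interval and command:

```python
# Sketch of the manual recurring-run loop that the scheduler replaces.
import subprocess
import time

INTERVAL_SECONDS = 6 * 60 * 60  # placeholder: run every 6 hours

while True:
    # Launch the scraper as a child process and wait for it to finish.
    subprocess.run(["python", "run_scraper.py"], check=False)
    time.sleep(INTERVAL_SECONDS)
```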

Can I Manage Scrapers On Multiple Servers?

Yes! ScrapeOps allows you to connect multiple servers and manage all your scraping jobs from one central dashboard, regardless of where they're running.

This is available on Business and Enterprise plans.


Features

How Do I Monitor Scheduled Jobs?

Scheduled jobs integrate with ScrapeOps Monitoring. When a job runs, you can see:

  • Real-time progress and stats
  • Comparison against historical runs
  • Errors and warnings
  • Alerts if something looks wrong

Check out the Monitoring FAQ for more details on monitoring features.


Plans & Billing

Free Plan

The Community Plan is free forever and includes basic monitoring features. Job scheduling and server management features are available on premium plans.

What Plans Include Job Scheduling?

Job scheduling and server management are available on premium plans. Check the Monitoring & Scheduling page for current pricing details.

The free Community Plan focuses on monitoring only.

Canceling A Plan

You can cancel your subscription at any time in your dashboard by going to the settings tab.