ScrapeOps MCP Server

The ScrapeOps MCP Server exposes the ScrapeOps Proxy API to MCP-compatible IDEs (Cursor, Claude Desktop, VS Code, Windsurf). It gives AI agents first-class scraping and browsing tools with proxy support, anti-bot bypass, JavaScript rendering, screenshots, and LLM-powered extraction.

MCP Server Product Updates

For the latest features and product changes, see the MCP Server Product Updates.


Capabilities

  • Web browsing via the ScrapeOps Proxy API (with geo-targeting and residential/mobile IPs)
  • Anti-bot bypass levels (Cloudflare, DataDome, PerimeterX, generic levels)
  • JavaScript rendering for dynamic sites
  • Screenshots (base64)
  • Structured extraction (auto or LLM schemas) to return clean JSON
  • Optimize-request logic to auto-test proxies and reuse the best-performing ones per site

Supported IDEs

| IDE | Configuration | Status |
| --- | --- | --- |
| Cursor | Settings → Features → MCP Servers | ✅ Available |
| Claude Desktop | claude_desktop_config.json | ✅ Available |
| VS Code | User Settings JSON | ✅ Available |
| Windsurf | ~/.codeium/windsurf/model_config.json | ✅ Available |

Requirements

  • A ScrapeOps API key, supplied via the SCRAPEOPS_API_KEY environment variable
  • Node.js with npm/npx (the server is published as the @scrapeops/mcp package)

Install & Run

Run the MCP server directly with npx:

env SCRAPEOPS_API_KEY=YOUR_API_KEY npx -y @scrapeops/mcp

Or install globally for repeated use:

npm install -g @scrapeops/mcp
scrapeops-mcp # uses SCRAPEOPS_API_KEY from env

Configure Clients

Add the ScrapeOps MCP server to your preferred IDE:

Cursor

Cursor has built-in MCP support via Settings → Features → MCP Servers.

Open Settings, navigate to Features, then MCP Servers, and click "Add new MCP Server" to enter the configuration below.

  • Command: npx
  • Args: -y, @scrapeops/mcp
  • Env: SCRAPEOPS_API_KEY=YOUR_API_KEY
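
If you prefer a file-based setup, newer Cursor releases also read a JSON MCP configuration (typically ~/.cursor/mcp.json). Assuming that format, the equivalent entry mirrors the Claude Desktop and Windsurf examples below:

{
  "mcpServers": {
    "scrapeops": {
      "command": "npx",
      "args": ["-y", "@scrapeops/mcp"],
      "env": { "SCRAPEOPS_API_KEY": "YOUR_API_KEY" }
    }
  }
}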

Claude Desktop

Claude Desktop supports MCP servers via claude_desktop_config.json.

Locate your config file (on macOS: ~/Library/Application Support/Claude/claude_desktop_config.json, on Windows: %APPDATA%\Claude\claude_desktop_config.json) and add the following configuration.

{
  "mcpServers": {
    "@scrapeops/mcp": {
      "command": "npx",
      "args": ["-y", "@scrapeops/mcp"],
      "env": { "SCRAPEOPS_API_KEY": "YOUR_API_KEY" }
    }
  }
}

VS Code

VS Code supports MCP servers via User Settings JSON.

Open the Command Palette (Cmd+Shift+P / Ctrl+Shift+P), search for "Preferences: Open User Settings (JSON)", and add the following configuration.

{
  "mcp": {
    "inputs": [
      { "type": "promptString", "id": "apiKey", "description": "ScrapeOps API Key", "password": true }
    ],
    "servers": {
      "scrapeops": {
        "command": "npx",
        "args": ["-y", "@scrapeops/mcp"],
        "env": { "SCRAPEOPS_API_KEY": "${input:apiKey}" }
      }
    }
  }
}

Windsurf

Windsurf supports MCP servers via ~/.codeium/windsurf/model_config.json.

Create or edit the config file in your home directory at ~/.codeium/windsurf/model_config.json and add the following configuration.

{
  "mcpServers": {
    "@scrapeops/mcp": {
      "command": "npx",
      "args": ["-y", "@scrapeops/mcp"],
      "env": { "SCRAPEOPS_API_KEY": "YOUR_API_KEY" }
    }
  }
}

Available Tools

The ScrapeOps MCP Server provides the following tools:

| Tool | Description |
| --- | --- |
| maps_web | Browse any URL with proxy support, JS rendering, screenshots |
| extract_data | Structured data extraction (auto or LLM schema-based) |
| return_links | Extract and normalize all links on a page |

maps_web (browsing)

  • Browse any URL with proxy support.
  • Options: country, residential, mobile, premium (level_1 | level_2), bypass_level, render_js, wait, wait_for, scroll, screenshot, device_type, keep_headers, follow_redirects.
  • Behavior: starts simple; if blocked, you can retry with stronger settings (e.g., residential + bypass).
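
As an illustration, a maps_web call that renders JavaScript and retries a protected page through residential proxies might pass arguments like the following. This is a hypothetical sketch assembled from the option names listed above; the url field and the bypass_level value shown are illustrative, so check the tool schema surfaced in your IDE for the exact parameters and accepted values.

{
  "url": "https://example.com/products",
  "country": "us",
  "residential": true,
  "render_js": true,
  "wait_for": ".product-card",
  "bypass_level": "generic_level_1"
}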

extract_data (structured data)

  • Modes: auto (domain parsers) or llm (schema-based).
  • Schemas: product_page, product_search_page, job_search_page, etc.
  • Options: bypass_level, render_js, wait_for, country, residential, mobile, premium.
  • Use when you want JSON instead of raw HTML.
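
For example, an llm-mode extraction against a product page might pass arguments along these lines. The mode and schema field names here are assumptions inferred from the options above, not confirmed parameter names, so verify them against the tool schema in your IDE:

{
  "url": "https://example.com/product/123",
  "mode": "llm",
  "schema": "product_page",
  "render_js": true,
  "country": "us"
}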

return_links (link extraction)

  • Extracts and normalizes all links on a page, distinguishing pages from assets.
  • Converts relative URLs to absolute; filters out mailto, tel, data, and javascript links.
  • Options: the same proxy/bypass fields (country, residential, mobile, premium, bypass_level), so you can crawl protected pages reliably.
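
A hypothetical return_links call against a bot-protected listing page, reusing the shared proxy fields, could look like this (the URL and bypass_level value are illustrative only):

{
  "url": "https://example.com/category/shoes",
  "residential": true,
  "bypass_level": "generic_level_1"
}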

Optimize Request Flags

  • Automatically runs lightweight tests to find working proxies for a target site.
  • Reuses the successful proxy settings on subsequent requests to improve reliability and cost.

Error Handling (high level)

  • Returns clear errors (e.g., invalid key, forbidden, rate limits) with suggestions.
  • Network errors can be retried; advanced parameters (residential, premium, bypass) are user-controlled.

Local HTTP/SSE usage

You can run the MCP server over HTTP/SSE instead of stdio:

export PORT=8080
export SCRAPEOPS_API_KEY=YOUR_API_KEY
scrapeops-mcp # or npm start
# SSE endpoint: http://localhost:8080/sse

Then point your MCP client to http://localhost:8080/sse and pass the API key via header scrapeops-api-key or environment.
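
Client support for SSE-based MCP servers varies. In a client that accepts an SSE server entry (for example, VS Code's mcp.servers format), the configuration might look like the sketch below; the type/url/headers shape is an assumption, so adapt it to whatever your client documents for remote servers:

{
  "mcp": {
    "servers": {
      "scrapeops-sse": {
        "type": "sse",
        "url": "http://localhost:8080/sse",
        "headers": { "scrapeops-api-key": "YOUR_API_KEY" }
      }
    }
  }
}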