Proxy Aggregator Frequently Asked Questions
Welcome to the ScrapeOps Proxy Aggregator FAQ page. Here you will find answers to the most common questions users have when using ScrapeOps Proxy Aggregator.
Overview
What Is The ScrapeOps Proxy Aggregator?
The ScrapeOps Proxy Aggregator is a proxy service that simplifies the process of web scraping by providing access to high-performing proxies via a single endpoint.
With ScrapeOps, users do not need to worry about manually finding the best and cheapest proxy providers, as we handle that for you. You just send us the URL you want to scrape and we take care of the rest, returning the HTML response from the website when the request is complete.
How Is The ScrapeOps Proxy Aggregator Different To Other Proxy Providers?
The ScrapeOps Proxy Aggregator is different from other proxy providers in that we aggregate other proxy providers together (we integrate with the 20 providers listed on this page) and route your requests through the best-performing, lowest-cost proxy providers for your target domains.
This means we take care of finding and optimizing the best proxy selection for your use case, and if a proxy provider goes down we automatically route your requests through another provider. So you can focus on building your data feeds and integrating them into your applications.
Getting Started & Usage
How To Use The ScrapeOps Proxy Aggregator?
The ScrapeOps Proxy Aggregator is a tool that provides access to the best-performing proxies via a single API or proxy port endpoint. To use it, you need an API key, which you can obtain by signing up for a free account here.
You can then send your requests to the ScrapeOps Proxy API or Proxy Port endpoint, including your API key and the URL you want to scrape, via GET or POST requests. The ScrapeOps Proxy takes care of proxy selection and rotation to find the best-performing proxy for your use case.
Here is some example code showing you how to send a GET request to the Proxy API endpoint using Python Requests:
import requests

response = requests.get(
    url='https://proxy.scrapeops.io/v1/',
    params={
        'api_key': 'YOUR_API_KEY',
        'url': 'http://httpbin.org/ip',
    },
)

print('Body: ', response.content)
Additionally, there is advanced functionality available such as country geotargeting, premium proxies, residential proxies, mobile proxies, anti-bot bypasses, waits, and more.
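For example, here is a minimal sketch of a request that enables some of these options by passing extra query parameters alongside the API key and target URL. The parameter names used here ('country', 'render_js') are assumptions for illustration; check the Proxy API documentation for the exact names and accepted values:

import requests

# Illustrative only: enable country geotargeting and JavaScript rendering
# by adding extra query parameters. Confirm parameter names in the docs.
response = requests.get(
    url='https://proxy.scrapeops.io/v1/',
    params={
        'api_key': 'YOUR_API_KEY',
        'url': 'http://httpbin.org/ip',
        'country': 'us',       # assumed geotargeting parameter
        'render_js': 'true',   # assumed JavaScript rendering parameter
    },
)

print('Status: ', response.status_code)
print('Body: ', response.content)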
For more information, check out our Getting Started Guide.
What Programming Languages Can The ScrapeOps Proxy Aggregator Work With?
The ScrapeOps Proxy Aggregator works with all major programming languages, including Python, Node.js, Golang, etc., as it can be integrated via its HTTP Proxy API endpoint or its Proxy Port.
The ScrapeOps Proxy supports GET and POST requests. For users whose web scrapers already integrate with existing proxy pools, ScrapeOps offers a proxy port solution.
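Below is a minimal sketch of a proxy port integration using Python Requests. The host, port, and credential format shown here are assumptions for illustration only; confirm the exact values in the Proxy Port documentation:

import requests

# Sketch of a proxy port integration, assuming the proxy accepts your API key
# as the proxy password. Host, port, and username are illustrative placeholders.
proxies = {
    'http': 'http://scrapeops:YOUR_API_KEY@proxy.scrapeops.io:5353',
    'https': 'http://scrapeops:YOUR_API_KEY@proxy.scrapeops.io:5353',
}

response = requests.get(
    url='http://httpbin.org/ip',
    proxies=proxies,
    timeout=120,      # proxied requests can take longer than direct ones
    verify=False,     # may be required if the proxy re-encrypts traffic; check the docs
)

print('Body: ', response.content)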
Check out the Integration Examples section of our documentation to see examples of how to integrate the ScrapeOps Proxy Aggregator into various types of web scrapers.
Proxy Performance & Errors
Dealing With 500 Errors
How To Fix Slow Responses
200 Successful Response But Data Isn't In HTML
Getting Bad Data Back As 200 Successful Responses
Plans & Billing
How Much Does 1 Request Cost?
The ScrapeOps Proxy Aggregator uses an API credit system to measure usage where you are only charged for successful responses.
The number of API credits used varies from 1 to 70 per request depending on the functionality and domain being scraped.
For most use cases, scraping 1 page will consume 1 API credit. However, if you enable advanced functionality like Javascript Rendering or are scraping heavily protected websites then it will cost more API credits per page.
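As a rough, illustrative calculation (the per-page credit costs below are placeholders; real costs depend on the domain and the features you enable, as covered in the Request Costs documentation):

# Estimate monthly API credit consumption. The credit costs here are
# placeholder assumptions, not official pricing.
pages_per_month = 100_000
credits_per_page = 1   # a standard page; heavily protected sites or JS rendering cost more

credits_needed = pages_per_month * credits_per_page
print(f'Estimated credits needed: {credits_needed}')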
For a full breakdown of request costs, check out the Request Costs documentation.
Free Plan
ScrapeOps offers a free plan of 1,000 free API credits per month (with a maximum of 1 concurrent connection) so you can test the Proxy Aggregator and run very small scraping projects.
What happens if I run out of credits before the end of my current subscription?
If you run out of API credits before your plan's renewal date, one of three things can happen, depending on which setting you have chosen:
- Auto-Renew: If you enable the auto-renew setting, then when you consume all your API credits your Proxy Aggregator plan will be automatically renewed. Your used API credits will be reset to zero and your renewal date will be moved forward by 1 month from that date.
- Auto-Upgrade: If you enable the auto-upgrade setting, then when you consume all your API credits your Proxy Aggregator plan will be upgraded to the next plan up. Your API credit limit will be increased to the new limit, while your consumed credits and next renewal date will remain the same.
- Blocked Account: If you enable the do-nothing setting, then when you consume all your API credits your Proxy Aggregator plan will be blocked and the API will return 401 error codes any time you send a request, until your plan is renewed or upgraded.
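As a minimal sketch of how a scraper might handle the blocked-account case, you can check for the 401 status code and stop sending requests until the plan is renewed or upgraded (the exact response body returned by the API is not shown here):

import requests

# Sketch: detect the 401 returned when a plan has run out of API credits
# (with the do-nothing setting) and stop scraping until the plan renews.
response = requests.get(
    url='https://proxy.scrapeops.io/v1/',
    params={
        'api_key': 'YOUR_API_KEY',
        'url': 'http://httpbin.org/ip',
    },
)

if response.status_code == 401:
    # Plan is out of credits and blocked; renew or upgrade before retrying.
    print('API credits exhausted: plan blocked until renewal or upgrade.')
else:
    print('Body: ', response.content)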
Canceling A Plan
You can cancel your subscription at any time in your dashboard by going to the settings tab.
Getting A Refund
We offer a 7-day refund policy if you are unhappy with the service. Contact support and we will refund you right away.
Change Credit Card Details
You can change your card details at any time on the Billing page in your dashboard, located within the settings tab.
Removing Credit Card Details
You can remove your card details at any time on the Billing page in your dashboard, located within the settings tab.
Pay-As-You-Go Option
Currently, we don’t offer a pay-as-you-go option with the Proxy Aggregator. All our plans are monthly subscriptions that reset each month.
Bandwidth Based Pricing
Currently, we don’t offer bandwidth-based pricing. All our plans are based on the number of requests you make to the API each month.
Buying Individual IPs
Currently, we don’t have the option to purchase individual proxies from our pools.
Dashboard & Stats
Proxy Stats API
At the moment we don't have a publicly available API for the stats displayed on the proxy dashboard, but it is something we have on our roadmap.