Proxy Aggregator Frequently Asked Questions

Welcome to the ScrapeOps Proxy Aggregator FAQ page. Here you will find answers to the most common questions users have when using the ScrapeOps Proxy Aggregator.

Overview

What Is The ScrapeOps Proxy Aggregator?

The ScrapeOps Proxy Aggregator is a proxy service that simplifies the process of web scraping by providing access to high-performing proxies via a single endpoint.

With ScrapeOps, you do not need to worry about manually finding the best and cheapest proxy providers, as we handle that for you. You just send us the URL you want to scrape and we take care of the rest, returning the HTML response from the website when the request is complete.

How Is The ScrapeOps Proxy Aggregator Different To Other Proxy Providers?

The ScrapeOps Proxy Aggregator is different from other proxy providers in that we aggregate many proxy providers together (we integrate with the 20 providers listed on this page) and route your requests through the best-performing, lowest-cost proxy providers for your target domains.

This means we take care of finding and optimizing the best proxy selection for your use case, and if a proxy provider goes down, we automatically route your requests through another provider. You can focus on building your data feeds and integrating them into your applications.


Getting Started & Usage

How To Use The ScrapeOps Proxy Aggregator?

The ScrapeOps Proxy Aggregator is a tool that provides access to the best-performing proxies via a single API or proxy port endpoint. To use it, you need an API key, which you can obtain by signing up for a free account here.

You can then send your requests to the ScrapeOps Proxy API or Proxy Port endpoint, including your API key and the URL you want to scrape, via GET or POST requests. The ScrapeOps Proxy takes care of proxy selection and rotation to find the best-performing proxy for your use case.

Here is some example code showing you how to send a GET request to the Proxy API endpoint using Python Requests:


import requests

response = requests.get(
    url='https://proxy.scrapeops.io/v1/',
    params={
        'api_key': 'YOUR_API_KEY',
        'url': 'http://httpbin.org/ip',
    },
)

print('Body: ', response.content)


Additionally, there is advanced functionality available such as country geotargeting, premium proxies, residential proxies, mobile proxies, anti-bot bypasses, waits, and more.
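
These advanced options are enabled by adding extra query parameters to the same endpoint. The sketch below is illustrative only: render_js, premium, and residential are the parameter names used elsewhere in this FAQ, while the country parameter name is an assumption, so confirm the exact names in the Getting Started Guide.

import requests

# Illustrative sketch: adds advanced options on top of the basic
# request shown above. The 'country' parameter name is an assumption.
response = requests.get(
    url='https://proxy.scrapeops.io/v1/',
    params={
        'api_key': 'YOUR_API_KEY',
        'url': 'http://httpbin.org/ip',
        'render_js': 'true',  # enable JavaScript rendering
        'country': 'us',      # assumed geotargeting parameter name
    },
)

print('Body: ', response.content)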

For more information, check out our Getting Started Guide.

What Programming Languages Can The ScrapeOps Proxy Aggregator Work With?

The ScrapeOps Proxy Aggregator can work with all major programming languages, including Python, NodeJs, Golang, etc., as it can be integrated via its HTTP Proxy API endpoint or its Proxy Port.

The ScrapeOps Proxy supports GET and POST requests. For users whose web scrapers already integrate with existing proxy pools, ScrapeOps offers a Proxy Port solution.

Check out the Integration Examples section of our documentation to see examples of how to integrate the ScrapeOps Proxy Aggregator into various types of web scrapers.
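
As a rough illustration, a Proxy Port integration with Python Requests might look like the sketch below. The host, port, and credential format shown are placeholders, not real values, so consult the Proxy Port documentation for the actual connection details.

import requests

# Placeholder values throughout: substitute the real host, port, and
# credential format from the Proxy Port documentation.
proxies = {
    'http': 'http://YOUR_API_KEY@PROXY_PORT_HOST:PORT',
    'https': 'http://YOUR_API_KEY@PROXY_PORT_HOST:PORT',
}

response = requests.get(
    'http://httpbin.org/ip',
    proxies=proxies,
    verify=False,  # SSL verification may need to be disabled; check the docs
)

print('Body: ', response.content)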


Proxy Performance & Errors

Dealing With 500 Errors

A 500 error typically indicates either a server-side error or that the underlying proxy providers were not able to retrieve the page data, meaning the problem is generally not with your request but with the servers responding to it.

You might also receive a 500 error when ScrapeOps has tried multiple providers and still did not receive the desired data. In such cases, wait a moment and then try again, as these errors are often transient. Note that you are not charged any API credits for requests that result in a 500 error.

You may also check the ScrapeOps Proxy Aggregator dashboard to see if there are any ongoing issues or contact support for more assistance. Detailed logging of your requests can also help identify any patterns or specific requests that might be causing the error.
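
Since 500 errors are often transient and cost no API credits, a simple retry loop with a short delay is usually enough. Here is a minimal sketch of that approach:

import time
import requests

# Retry transient 500 responses a few times with a growing delay.
# Requests that return 500 do not consume API credits.
for attempt in range(3):
    response = requests.get(
        url='https://proxy.scrapeops.io/v1/',
        params={
            'api_key': 'YOUR_API_KEY',
            'url': 'http://httpbin.org/ip',
        },
    )
    if response.status_code != 500:
        break
    time.sleep(2 ** attempt)  # wait 1s, then 2s, then 4s

print('Status: ', response.status_code)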

How To Fix Slow Responses

If you are experiencing slow responses, consider the following strategies to optimize your requests and enhance the response time:

  • Optimize Request Parameters: Remove unnecessary parameters from your requests, such as render_js, unless the website heavily relies on JavaScript for rendering content. This can help in reducing the load and potentially speed up the response time.

  • Adjust Wait Time: The ScrapeOps API allows you to pass a wait_time parameter, which pauses the scraper so page elements have time to load or new pages to render before scraping occurs. Tuning this parameter properly ensures efficient waiting and avoids unnecessary delays, especially when dealing with dynamically loaded content.

  • Leverage Premium Proxies: If you’re still facing issues, consider using the premium=true option, which may offer better and faster responses. However, note that this option costs 1.5 API credits per request, reflecting the enhanced service level.
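
Putting these together, a request tuned for speed might look like the sketch below. The wait_time value is illustrative, so confirm the expected unit in the documentation, and only uncomment the optional parameters if the target page actually needs them.

import requests

# Keep the parameter set minimal and only add the options the target
# page actually needs. The wait_time value is illustrative.
response = requests.get(
    url='https://proxy.scrapeops.io/v1/',
    params={
        'api_key': 'YOUR_API_KEY',
        'url': 'http://httpbin.org/ip',
        'wait_time': 3000,      # illustrative value; confirm the unit in the docs
        # 'render_js': 'true',  # omit unless the page needs JavaScript
        # 'premium': 'true',    # fallback option, 1.5 credits per request
    },
)

print('Took: ', response.elapsed.total_seconds(), 'seconds')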

For a deeper understanding of optimizing your requests and reducing response times, refer to our comprehensive documentation. Remember, optimizing your requests not only ensures faster response times but also helps in efficient consumption of API credits, thus being cost-effective in the long run.

200 Successful Response But Data Isn't In HTML

Receiving a 200 status code signifies a successful response, but if the data isn’t in HTML, there may be some content rendering issues. Check if the targeted website relies heavily on JavaScript for content rendering. If so, consider enabling JavaScript rendering in your ScrapeOps requests. This can be achieved by incorporating the render_js=true parameter within your request. For a more comprehensive understanding and for additional advanced features, please refer to our detailed guide here.

Additionally, inspect the received data, as it might be in a different format like JSON or XML. Adjusting the request headers to specify the desired content-type can also help in receiving the data in the correct format.
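
For instance, you could inspect the Content-Type header of a 200 response and retry with JavaScript rendering enabled if the body does not look like HTML. A minimal sketch:

import requests

params = {
    'api_key': 'YOUR_API_KEY',
    'url': 'http://httpbin.org/ip',
}
response = requests.get('https://proxy.scrapeops.io/v1/', params=params)

# Inspect what the upstream site actually returned; it may be JSON
# or XML rather than HTML.
print('Content-Type: ', response.headers.get('Content-Type'))

# If the body does not look like HTML, retry with JavaScript
# rendering enabled.
if '<html' not in response.text.lower():
    params['render_js'] = 'true'
    response = requests.get('https://proxy.scrapeops.io/v1/', params=params)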

Getting Bad Data Back As 200 Successful Responses

Receiving unexpected or ‘bad’ data, or banned pages, even with a successful 200 response, might be due to the website serving different content to different user agents or IP addresses, possibly to deter scraping. In such scenarios, consider the following approaches:

  • Use Residential Proxies: Leveraging residential proxies can help in receiving more accurate and reliable data. This can be enabled with the residential=true parameter; review our guide for more details.
  • Review the Response: Examine the received data carefully for any clues or patterns that might help in understanding the discrepancies in the data received.
  • Contact Support: For persistent issues, get in touch with the support team for more nuanced insights and resolutions.
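
As a sketch of the first approach, you could detect a suspect response and retry the request with residential proxies enabled. The ban-detection check here is a hypothetical placeholder that you would tailor to your target site:

import requests

params = {
    'api_key': 'YOUR_API_KEY',
    'url': 'http://httpbin.org/ip',
}
response = requests.get('https://proxy.scrapeops.io/v1/', params=params)

# Hypothetical ban check: adapt the marker strings to whatever the
# 'bad' pages from your target site actually contain.
if 'captcha' in response.text.lower():
    params['residential'] = 'true'  # retry through residential proxies
    response = requests.get('https://proxy.scrapeops.io/v1/', params=params)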

Remember to monitor the dashboard for real-time updates on your usage and the status of the ScrapeOps Proxy Aggregator, and refer to the comprehensive documentation for more detailed information on optimizing your web scraping operations.


Plans & Billing

How Much Does 1 Request Cost?

The ScrapeOps Proxy Aggregator uses an API credit system to measure usage where you are only charged for successful responses.

The number of API credits used varies from 1 to 70 per request depending on the functionality and domain being scraped.

For most use cases, scraping 1 page will consume 1 API credit. However, if you enable advanced functionality like Javascript Rendering or are scraping heavily protected websites then it will cost more API credits per page.
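
For example, a plan with 1,000 API credits covers roughly 1,000 standard pages at 1 credit each, but only around 666 pages if every request uses the premium=true option at 1.5 credits per request.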

For a full breakdown of request costs, check out the Request Costs documentation.

Free Plan

ScrapeOps offers a free plan with 1,000 free API credits per month (and a maximum of 1 concurrent connection) to test the Proxy Aggregator and for very small scraping projects.

What happens if I run out of credits before the end of my current subscription?

If you run out of API credits before your plan's renewal date, one of three things can happen, depending on which setting you have chosen:

  • Auto-Renew: If you enable the auto-renew setting, then when you consume all your API credits, your Proxy Aggregator plan will be automatically renewed. Your used API credits will be reset to zero and your renewal date will be moved forward one month from that date.
  • Auto-Upgrade: If you enable the auto-upgrade setting then when you consume all your API credits, your Proxy Aggregator plan will be upgraded to the next plan up. So your API credit limit will be increased to the new limit and your consumed credits and next renewal date will remain the same.
  • Blocked Account: If you enable the do-nothing setting, then when you consume all your API credits, your Proxy Aggregator plan will be blocked and the API will return 401 error codes on every request until your plan is renewed or upgraded.

Canceling A Plan

You can cancel your subscription at any time in your dashboard by going to the settings tab.

Getting A Refund

We offer a 7-day refund policy if you are unhappy with the service. Contact support and we will refund you right away.

Change Credit Card Details

You can change your card details anytime on the Billing page in your dashboard, located within the settings tab.

Removing Credit Card Details

You can remove your card details anytime on the Billing page in your dashboard, located within the settings tab.

Pay-As-You-Go Option

Currently, we don’t offer a pay-as-you-go option with the Proxy Aggregator. All our plans are monthly subscriptions that reset each month.

Bandwidth Based Pricing

Currently, we don’t offer bandwidth-based pricing. All our plans are based on the number of requests you make to the API each month.

Buying Individual IPs

Currently, we don’t have the option to purchase individual proxies from our pools.


Dashboard & Stats

Proxy Stats API

At the moment we don't have a publicly available API for the stats displayed on the proxy dashboard, but it is something we have on our roadmap.