Getting Started

ScrapeOps Residential Proxy Aggregator is an easy to use proxy that gives you access to the best performing proxies via a single endpoint. We take care of finding the best proxies, so you can focus on the data.

BETA: Residential & Mobile Proxy Aggregator

The Residential Proxy Aggregator is currently in beta, so there may still be some bugs and not all functionality has been enabled yet. If you notice any bugs or you require specific functionality, just let our support team know and we can look into it for you.

Authorisation: API Key

To use the ScrapeOps proxy, you first need an API key which you can get by signing up for a free account here.

Your API key must be included with every request as the password in the proxy port connection details, otherwise the proxy port will return a 403 Forbidden Access status code.


Integration Method: Proxy Port

To make requests, you need to set your HTTP client's proxy to the ScrapeOps Residential Proxy Port http://scrapeops:YOUR_API_KEY@residential-proxy.scrapeops.io:8181 and then request the URL you want to scrape as normal.

The username for the proxy is scrapeops and the password is your API key.


curl -x "http://scrapeops:YOUR_API_KEY@residential-proxy.scrapeops.io:8181" "https://httpbin.org/ip"


Here are the individual connection details:

  • Proxy: residential-proxy.scrapeops.io
  • Port: 8181
  • Username: scrapeops
  • Password: YOUR_API_KEY

Below we have an example of how you would use our proxy port with Python Requests.


import requests

proxies = {
    "http": "http://scrapeops:YOUR_API_KEY@residential-proxy.scrapeops.io:8181",
    "https": "http://scrapeops:YOUR_API_KEY@residential-proxy.scrapeops.io:8181",
}

response = requests.get('https://httpbin.org/ip', proxies=proxies, verify=False)
print(response.text)

SSL Certificate Verification

Note: So that we can properly direct your requests through the proxy port, your code must be configured to not verify SSL certificates.
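
When you set verify=False with Python Requests, you will typically see InsecureRequestWarning messages in your logs. If you want to silence them, a common (optional) approach is shown in this short sketch:


import urllib3

# Optional: silence the InsecureRequestWarning emitted when verify=False is used.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)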

The ScrapeOps Proxy supports GET and POST requests. For information on how to use POST requests, check out the documentation here.
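
As an illustration, a POST request is sent through the proxy port in the same way as a GET request. The sketch below uses Python Requests with httpbin's /post endpoint and an example JSON payload as stand-ins for your own target and data.


import requests

proxies = {
    "http": "http://scrapeops:YOUR_API_KEY@residential-proxy.scrapeops.io:8181",
    "https": "http://scrapeops:YOUR_API_KEY@residential-proxy.scrapeops.io:8181",
}

# Example POST request sent through the proxy port (httpbin echoes the payload back).
response = requests.post(
    'https://httpbin.org/post',
    json={"example": "payload"},
    proxies=proxies,
    verify=False,
)
print(response.json())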

Scrapy users can likewise simply pass the proxy details via the meta object.


# ...other scrapy setup code
start_urls = ['https://httpbin.org/ip']
meta = {
    "proxy": "http://scrapeops:YOUR_API_KEY@residential-proxy.scrapeops.io:8181"
}

def start_requests(self):
    for url in self.start_urls:
        yield scrapy.Request(url, callback=self.parse, meta=self.meta)

def parse(self, response):
    # ...your parsing logic here
    pass

Scrapy & SSL Certificate Verification

Note: Scrapy skips SSL verification by default so you don't need to worry about switching it off.


Response Formats

The ScrapeOps Residential Proxy Aggregator returns the response returned by the target URL you request.

This response could be in HTML, JSON, XML, etc. format, depending on the response returned by the website's server.

Example response:


<html>
<head>
...
</head>
<body>
...
</body>
</html>

The response will contain the body (HTML, JSON, etc.) and any response headers. (Note: cookies aren't returned.)
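
Since the content type depends on the target website, you may want to branch on the response's Content-Type header before parsing. Below is a minimal sketch with Python Requests; the httpbin URL and the branching logic are just illustrative assumptions.


import requests

proxies = {
    "http": "http://scrapeops:YOUR_API_KEY@residential-proxy.scrapeops.io:8181",
    "https": "http://scrapeops:YOUR_API_KEY@residential-proxy.scrapeops.io:8181",
}

response = requests.get('https://httpbin.org/ip', proxies=proxies, verify=False)

# Branch on the Content-Type header returned by the target website.
content_type = response.headers.get("Content-Type", "")
if "application/json" in content_type:
    data = response.json()
else:
    data = response.text  # HTML, XML, etc.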


Status Codes

The ScrapeOps Residential Proxy Aggregator will return the status code returned by the target website.

However, the proxy port will return the following status codes if there are specific errors in your request:

Status Code   Billed   Description
400           No       Bad request. Either your url or query parameters are incorrectly formatted.
401           No       You have consumed all your credits. Either turn off your scraper, or upgrade to a larger plan.
403           No       Either no api_key included on request, or api_key is invalid.

Here is the full list of status codes the Proxy Port returns.


Advanced Functionality

To enable advanced proxy functionality when using the Residential Proxy endpoint, you need to pass parameters by adding them to the username, separated by periods.

For example, if you want to enable Country Geotargeting with a request, the username would be scrapeops.country=us.


curl -x "http://scrapeops.country=us:YOUR_API_KEY@residential-proxy.scrapeops.io:8181" "https://httpbin.org/ip"

The Residential Proxy Aggregator will accept the following parameters:

Parameter   Description
country     Make requests from a specific country. Example: country=us
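
For example, the country parameter can be used from Python Requests by building the proxy URL with the parameter appended to the username, as in the curl example above. A minimal sketch (country=us is just an example value):


import requests

# Advanced parameters are appended to the username, separated by periods.
proxy_url = "http://scrapeops.country=us:YOUR_API_KEY@residential-proxy.scrapeops.io:8181"
proxies = {"http": proxy_url, "https": proxy_url}

response = requests.get('https://httpbin.org/ip', proxies=proxies, verify=False)
print(response.text)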

Check out this guide to see the full list of advanced functionality available.

BETA

As the Residential Proxy Aggregator is in beta, not all functionality has been enabled at the moment. More advanced functionality will be enabled publicly over the coming weeks. If you require specific functionality, just let our support team know and we can enable it for your account.


Timeout

The ScrapeOps proxy keeps retrying a request for up to 2 minutes before returning a failed response to you.

To use the Proxy correctly, you should set the timeout on your request to at least 2 minutes, so that you don't get charged for a successful request that timed out on your end before the Proxy Port responded.
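
For example, with Python Requests you can set the request timeout to 120 seconds so your client waits at least as long as the proxy keeps retrying. This is a minimal sketch; the exact timeout value is your choice, as long as it is 2 minutes or more.


import requests

proxies = {
    "http": "http://scrapeops:YOUR_API_KEY@residential-proxy.scrapeops.io:8181",
    "https": "http://scrapeops:YOUR_API_KEY@residential-proxy.scrapeops.io:8181",
}

# Allow up to 120 seconds so the proxy has time to retry before the client gives up.
response = requests.get(
    'https://httpbin.org/ip',
    proxies=proxies,
    verify=False,
    timeout=120,
)
print(response.text)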