Custom Cookies

The ScrapeOps Proxy API Aggregator enables you to use your own custom cookies when making requests, instead of the optimized ones we attach to requests by default.

You can use your own custom cookies by adding the custom_cookies parameter to your request, along with the cookies you want to send. The API will then use any custom cookies you have set in your request.

This functionality is useful if the data the website returns depends on specific cookies, or if you need to maintain session cookies between requests.

Currently, the API supports the following cookie attributes:

  • name (required)
  • value (required)
  • domain (optional)
  • path (optional)
  • expires (optional)

You need to separate each attribute with a comma (,) and each cookie with a semicolon (;).

Here is an example cookie string where we send two cookies name_1 and name_2:


'name_1=cookie_1,domain=example.com;name_2=cookie_2,domain=example.com;'

To ensure your cookies are correctly parsed by our API, you should always URL-encode the cookie string. The above cookie string would be encoded as:


'name_1%3Dcookie_1%2Cdomain%3Dexample.com%3Bname_2%3Dcookie_2%2Cdomain%3Dexample.com%3B'

Here is documentation on how to encode strings in various programming languages.
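In Python, for example, the encoding can be done with the standard library's urllib.parse.quote (a minimal sketch; other languages have equivalent helpers such as encodeURIComponent in JavaScript):

```python
from urllib.parse import quote

# The raw cookie string from the example above
cookie_string = 'name_1=cookie_1,domain=example.com;name_2=cookie_2,domain=example.com;'

# safe='' ensures every reserved character (=, comma, ;) is percent-encoded
encoded = quote(cookie_string, safe='')
print(encoded)
# name_1%3Dcookie_1%2Cdomain%3Dexample.com%3Bname_2%3Dcookie_2%2Cdomain%3Dexample.com%3B
```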

From here you just add this cookie string to your API request using the custom_cookies parameter.


curl -k "https://proxy.scrapeops.io/v1/?api_key=YOUR_API_KEY&url=http://httpbin.org/cookies?json&custom_cookies=name_1%3Dcookie_1%2Cdomain%3Dexample.com%3Bname_2%3Dcookie_2%2Cdomain%3Dexample.com%3B"
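If you are assembling cookies programmatically, a small helper can build the comma/semicolon-separated string from structured data. This is just an illustrative sketch (the build_cookie_string helper is our own, not part of the API):

```python
def build_cookie_string(cookies):
    """Build a ScrapeOps-style cookie string from a list of dicts.

    Each dict needs 'name' and 'value'; 'domain', 'path', and
    'expires' are optional. Attributes are comma-separated and
    each cookie is terminated with a semicolon.
    """
    parts = []
    for cookie in cookies:
        attrs = [f"{cookie['name']}={cookie['value']}"]
        for attr in ('domain', 'path', 'expires'):
            if attr in cookie:
                attrs.append(f"{attr}={cookie[attr]}")
        parts.append(','.join(attrs) + ';')
    return ''.join(parts)


cookie_string = build_cookie_string([
    {'name': 'name_1', 'value': 'cookie_1', 'domain': 'example.com'},
    {'name': 'name_2', 'value': 'cookie_2', 'domain': 'example.com'},
])
print(cookie_string)
# name_1=cookie_1,domain=example.com;name_2=cookie_2,domain=example.com;
```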

The following is example Python code showing how to send custom cookies to the proxy API:


import requests

proxy_params = {
    'api_key': 'YOUR_API_KEY',
    'url': 'http://httpbin.org/cookies?json',
    'custom_cookies': 'name_1=cookie_1,domain=example.com;name_2=cookie_2,domain=example.com;',
}

# Pass the dict directly: requests URL-encodes the parameter values,
# so pre-encoding with urlencode() would double-encode the cookie string.
response = requests.get(
    url='https://proxy.scrapeops.io/v1/',
    params=proxy_params,
    timeout=120,
)

print('Body: ', response.content)