POST Requests

The ScrapeOps Proxy API supports sending POST requests in order to scrape forms or API endpoints.

Simply send the POST request as normal, specifying the Content-Type header and adding the query parameter keep_headers=true to your request so that the API knows to use the custom headers you have defined. For more info on custom headers, check out the docs here.

# For JSON data
curl -X POST \
  'https://proxy.scrapeops.io/v1/?api_key=YOUR_API_KEY&url=https://httpbin.org/post&keep_headers=true' \
  -H 'Content-Type: application/json' \
  -d '{"foo": "bar"}'

# For form data
curl -X POST \
  'https://proxy.scrapeops.io/v1/?api_key=YOUR_API_KEY&url=https://httpbin.org/post&keep_headers=true' \
  -H 'Content-Type: application/x-www-form-urlencoded' \
  -d 'foo=bar'

Here is an example of how to send POST requests to our Proxy API using Python Requests:

import requests
from urllib.parse import urlencode

API_KEY = 'YOUR_API_KEY'


def get_scrapeops_url(url):
    payload = {'api_key': API_KEY, 'url': url, 'keep_headers': True}
    proxy_url = 'https://proxy.scrapeops.io/v1/?' + urlencode(payload)
    return proxy_url


url = 'https://httpbin.org/post'

response = requests.post(
    get_scrapeops_url(url),
    json={'key': 'value'},
    headers={'custom-header': 'hello'},
)


When scraping POST endpoints, you need to figure out what data and headers must be sent with the request to get the correct response.
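One way to check what you are about to send is to build the request without dispatching it. As a sketch (the form fields and the X-Requested-With header here are hypothetical stand-ins for whatever the target site actually expects), Python Requests lets you prepare a request and inspect its exact body and headers before sending it through the proxy:

```python
import requests
from urllib.parse import urlencode

API_KEY = 'YOUR_API_KEY'  # placeholder


def get_scrapeops_url(url):
    payload = {'api_key': API_KEY, 'url': url, 'keep_headers': True}
    return 'https://proxy.scrapeops.io/v1/?' + urlencode(payload)


# Build the request without sending it, so you can verify the exact
# body and headers the target endpoint will receive.
req = requests.Request(
    'POST',
    get_scrapeops_url('https://httpbin.org/post'),
    data={'username': 'demo', 'remember': 'true'},   # hypothetical form fields
    headers={'X-Requested-With': 'XMLHttpRequest'},  # hypothetical required header
)
prepared = req.prepare()

print(prepared.body)                     # the form-encoded payload
print(prepared.headers['Content-Type'])  # set automatically for form data
```

Once the prepared body and headers match what the browser sends (compare against your browser's network tab), dispatch it with requests.Session().send(prepared) or switch back to requests.post().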

Scraping POST endpoints can be trickier, as it is normally easier for these endpoints to block requests: getting a valid response is highly dependent on whether the data and headers you are sending contain all the correct information.