POST/PUT Requests
The ScrapeOps Proxy API Aggregator supports sending POST and PUT requests, so you can scrape forms and API endpoints.
When scraping POST or PUT endpoints, you first need to work out what data and headers the endpoint requires before it will return the correct response.
Scraping POST and PUT endpoints can be trickier than GET requests, as they are easier for websites to block: success depends on the data and headers you send containing all the correct information.
POST Requests
Simply send the POST request as normal, specifying the Content-Type in the headers and adding the query parameter keep_headers=true to your request so that the API knows to use the custom headers you have defined. For more info on custom headers check out the docs here.
# For JSON data
curl -X POST \
  -H 'Content-Type: application/json' \
  -d '{"foo": "bar"}' \
  "https://proxy.scrapeops.io/v1/?api_key=YOUR_API_KEY&url=http://httpbin.org/anything&keep_headers=true"

# For form data
curl -X POST \
  -H 'Content-Type: application/x-www-form-urlencoded' \
  -d 'foo=bar' \
  "https://proxy.scrapeops.io/v1/?api_key=YOUR_API_KEY&url=http://httpbin.org/anything&keep_headers=true"
Here is an example of how to send POST
requests to the Proxy API using Python Requests:
import requests
from urllib.parse import urlencode

API_KEY = 'YOUR_API_KEY'

def get_scrapeops_url(url):
    payload = {'api_key': API_KEY, 'url': url, 'keep_headers': True}
    proxy_url = 'https://proxy.scrapeops.io/v1/?' + urlencode(payload)
    return proxy_url

url = 'http://httpbin.org/anything'
response = requests.post(get_scrapeops_url(url), json={'key': 'value'}, headers={'Content-Type': 'application/json'})
response.json()
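One detail worth noting in the helper above: urlencode percent-encodes the target URL, which matters once the page you are scraping has its own query string, since its ? and & characters would otherwise be confused with the proxy's own parameters. A minimal sketch (the target URL with a query string is just an illustration):

```python
from urllib.parse import urlencode

API_KEY = 'YOUR_API_KEY'

def get_scrapeops_url(url):
    # urlencode percent-encodes the target URL, so its own query
    # string cannot collide with the proxy's api_key/url/keep_headers
    # parameters
    payload = {'api_key': API_KEY, 'url': url, 'keep_headers': True}
    return 'https://proxy.scrapeops.io/v1/?' + urlencode(payload)

proxy_url = get_scrapeops_url('http://httpbin.org/anything?page=1&sort=asc')
print(proxy_url)
```

The target URL's `?`, `&`, and `=` come out as `%3F`, `%26`, and `%3D`, so the proxy receives it intact.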
PUT Requests
Simply send the PUT request as normal, specifying the Content-Type in the headers and adding the query parameter keep_headers=true to your request so that the API knows to use the custom headers you have defined. For more info on custom headers check out the docs here.
# For JSON data
curl -X PUT \
  -H 'Content-Type: application/json' \
  -d '{"foo": "bar"}' \
  "https://proxy.scrapeops.io/v1/?api_key=YOUR_API_KEY&url=http://httpbin.org/anything&keep_headers=true"

# For form data
curl -X PUT \
  -H 'Content-Type: application/x-www-form-urlencoded' \
  -d 'foo=bar' \
  "https://proxy.scrapeops.io/v1/?api_key=YOUR_API_KEY&url=http://httpbin.org/anything&keep_headers=true"
Here is an example of how to send PUT
requests to the Proxy API using Python Requests:
import requests
from urllib.parse import urlencode

API_KEY = 'YOUR_API_KEY'

def get_scrapeops_url(url):
    payload = {'api_key': API_KEY, 'url': url, 'keep_headers': True}
    proxy_url = 'https://proxy.scrapeops.io/v1/?' + urlencode(payload)
    return proxy_url

url = 'http://httpbin.org/anything'
response = requests.put(get_scrapeops_url(url), json={'key': 'value'}, headers={'Content-Type': 'application/json'})
response.json()
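Since the POST and PUT snippets differ only in the HTTP verb, they can be folded into one small helper. A sketch (send_via_proxy is a hypothetical name of ours, not part of the ScrapeOps API) that routes any method through the proxy and fails loudly on HTTP errors before decoding the JSON body:

```python
import requests
from urllib.parse import urlencode

API_KEY = 'YOUR_API_KEY'

def get_scrapeops_url(url):
    payload = {'api_key': API_KEY, 'url': url, 'keep_headers': True}
    return 'https://proxy.scrapeops.io/v1/?' + urlencode(payload)

def send_via_proxy(method, url, **kwargs):
    # Hypothetical convenience wrapper: extra keyword arguments
    # (json=, data=, headers=) are passed straight to requests.request()
    response = requests.request(method, get_scrapeops_url(url), **kwargs)
    response.raise_for_status()  # raise on 4xx/5xx before decoding
    return response.json()

# Usage (requires a real API key):
# body = send_via_proxy('PUT', 'http://httpbin.org/anything',
#                       json={'key': 'value'},
#                       headers={'Content-Type': 'application/json'})
```

Using data= instead of json= sends the payload form-encoded, matching the application/x-www-form-urlencoded curl examples above.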