Python Requests Fake Headers Integration

The following are two examples of how to integrate the Fake Browser Headers API and the Fake User-Agent API into your Python Requests-based web scrapers.


Python Requests Fake Browser Headers API Integration

To integrate the Fake Browser Headers API, you should configure your scraper to retrieve a batch of the most up-to-date headers when it starts, and then pick a random header from this list for each request.

Here is an example Python scraper integration:


import requests
from random import randint

SCRAPEOPS_API_KEY = 'YOUR_API_KEY'

def get_headers_list():
    response = requests.get('http://headers.scrapeops.io/v1/browser-headers?api_key=' + SCRAPEOPS_API_KEY)
    json_response = response.json()
    return json_response.get('result', [])

def get_random_header(header_list):
    random_index = randint(0, len(header_list) - 1)
    return header_list[random_index]

## Retrieve Header List From ScrapeOps
header_list = get_headers_list()

url_list = [
    'https://example.com/1',
    'https://example.com/2',
    'https://example.com/3',
]

for url in url_list:

    ## Add Random Header To Request
    r = requests.get(url=url, headers=get_random_header(header_list))
    print(r.text)

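For reference, the browser-headers endpoint returns a JSON object whose 'result' key holds a list of header dictionaries, which is why get_headers_list() reads json_response.get('result', []). The exact fields vary from header to header; a response might look roughly like this (illustrative field names and values only, not real API output):

## Illustrative response shape; field names and values are examples
{
    'result': [
        {
            'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...',
            'accept': 'text/html,application/xhtml+xml,...',
            'accept-language': 'en-US,en;q=0.9',
            'upgrade-insecure-requests': '1'
        },
        ...
    ]
}
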

Python Requests Fake User-Agent API Integration

To integrate the Fake User-Agent API, you should configure your scraper to retrieve a batch of the most up-to-date user-agents when it starts, and then pick a random user-agent from this list for each request.

Here is an example Python scraper integration:


import requests
from random import randint

SCRAPEOPS_API_KEY = 'YOUR_API_KEY'

def get_user_agent_list():
    response = requests.get('http://headers.scrapeops.io/v1/user-agents?api_key=' + SCRAPEOPS_API_KEY)
    json_response = response.json()
    return json_response.get('result', [])

def get_random_user_agent(user_agent_list):
    random_index = randint(0, len(user_agent_list) - 1)
    return user_agent_list[random_index]

## Retrieve User-Agent List From ScrapeOps
user_agent_list = get_user_agent_list()

url_list = [
    'https://example.com/1',
    'https://example.com/2',
    'https://example.com/3',
]

for url in url_list:

    ## Add Random User-Agent To Headers
    headers = {'User-Agent': get_random_user_agent(user_agent_list)}

    ## Make Requests
    r = requests.get(url=url, headers=headers)
    print(r.text)

Here the scraper will use a random user-agent for each request.
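
Note that get_user_agent_list() returns an empty list if the API call fails or the response contains no results, in which case the randint call above raises a ValueError. A minimal defensive sketch (the fallback user-agent string here is an assumed placeholder, not something the API returns):

def get_random_user_agent(user_agent_list):
    ## Fall back to a static user-agent if the API returned nothing (assumed placeholder value)
    if not user_agent_list:
        return 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)'
    random_index = randint(0, len(user_agent_list) - 1)
    return user_agent_list[random_index]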


API Parameters

The following is a list of API parameters that you can include with your requests to customise the header list response.

Parameter     Description
api_key       This is a required parameter. You can get a free API key by signing up to ScrapeOps.
num_results   By default the API returns a list of 10 results. You can increase this by setting num_results, up to a maximum of 100.
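
For example, to request a larger batch of user-agents you can pass num_results alongside your API key. A quick sketch using the params argument of requests.get() (50 is just an example value):

import requests

SCRAPEOPS_API_KEY = 'YOUR_API_KEY'

## Ask for 50 user-agents instead of the default 10
response = requests.get(
    'http://headers.scrapeops.io/v1/user-agents',
    params={'api_key': SCRAPEOPS_API_KEY, 'num_results': 50},
)
user_agent_list = response.json().get('result', [])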