
Fake User-Agent API

The ScrapeOps Fake User-Agent API is an easy-to-use tool that returns a list of fake user-agents you can use in your web scrapers to bypass simple anti-bot defenses.


Authorisation - API Key

To use the ScrapeOps Fake User-Agent API, you first need an API key, which you can get by signing up for a free ScrapeOps account.

Your API key must be included with every request using the api_key query parameter; otherwise, the API will return a 403 Forbidden status code.

The API is free to use; the API key is simply used to prevent misuse of the endpoint.
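
For example, here is a minimal request sketch using Python Requests (YOUR_API_KEY is a placeholder for your own key) that passes the api_key query parameter and checks for the 403 response:

import requests

# Pass your API key via the api_key query parameter.
# 'YOUR_API_KEY' is a placeholder; use the key from your ScrapeOps account.
response = requests.get(
    'http://headers.scrapeops.io/v1/user-agents',
    params={'api_key': 'YOUR_API_KEY'},
)

if response.status_code == 403:
    # Missing or invalid API key.
    print('403 Forbidden: check your api_key value.')
else:
    print(response.json())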


How To Retrieve Fake User-Agents From The API

To use the ScrapeOps Fake User-Agent API, just send a GET request to the endpoint below to retrieve a list of user-agents.


http://headers.scrapeops.io/v1/user-agents?api_key=YOUR_API_KEY

Here is an example using Python Requests:


import requests

# Request a list of fake user-agents (replace YOUR_API_KEY with your own key).
response = requests.get('http://headers.scrapeops.io/v1/user-agents?api_key=YOUR_API_KEY')

print(response.json())

Example response from the API:


{
"result": [
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.0.5 Safari/605.1.15",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.5060.53 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Windows; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.5060.114 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_5) AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.1.2 Safari/603.3.8",
"Mozilla/5.0 (Windows NT 10.0; Windows; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.5060.114 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.0 Safari/605.1.15",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.5060.53 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.0 Safari/605.1.15",
"Mozilla/5.0 (Windows NT 10.0; Windows; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.5060.114 Safari/537.36",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.5060.53 Safari/537.36"
]
}


Scraper Integration

To integrate the Fake User-Agent API, configure your scraper to retrieve a batch of the most up-to-date user-agents when it starts, and then pick a random user-agent from this list for each request.

Here is an example Python scraper integration:


import requests
from random import randint

SCRAPEOPS_API_KEY = 'YOUR_API_KEY'

def get_user_agent_list():
    response = requests.get('http://headers.scrapeops.io/v1/user-agents?api_key=' + SCRAPEOPS_API_KEY)
    json_response = response.json()
    return json_response.get('result', [])

def get_random_user_agent(user_agent_list):
    random_index = randint(0, len(user_agent_list) - 1)
    return user_agent_list[random_index]

## Retrieve User-Agent List From ScrapeOps
user_agent_list = get_user_agent_list()

url_list = [
    'https://example.com/1',
    'https://example.com/2',
    'https://example.com/3',
]

for url in url_list:

    ## Add Random User-Agent To Headers
    headers = {'User-Agent': get_random_user_agent(user_agent_list)}

    ## Make Requests
    r = requests.get(url=url, headers=headers)
    print(r.text)

Here the scraper will use a random user-agent for each request.
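
One way to make this integration a little more robust is to fall back to a small hard-coded user-agent list whenever the API call fails. Here is a sketch, assuming you maintain your own fallback list (the FALLBACK_USER_AGENTS value and the error handling below are illustrative, not part of the API):

import requests

SCRAPEOPS_API_KEY = 'YOUR_API_KEY'

# Illustrative fallback list; used only if the API request fails.
FALLBACK_USER_AGENTS = [
    'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.5060.114 Safari/537.36',
]

def get_user_agent_list():
    try:
        response = requests.get(
            'http://headers.scrapeops.io/v1/user-agents',
            params={'api_key': SCRAPEOPS_API_KEY},
            timeout=10,
        )
        response.raise_for_status()
        return response.json().get('result', FALLBACK_USER_AGENTS)
    except requests.RequestException:
        return FALLBACK_USER_AGENTS

Everything else in the integration above stays the same.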


API Parameters

The following API parameters can be included with your requests to customise the user-agent list response.

Parameter      Description
api_key        This is a required parameter. You can get your free API key by signing up for a ScrapeOps account.
num_results    (Optional) By default the API returns a list of 10 user-agents. You can increase that number with the num_results parameter, up to a maximum of 100 user-agents.
mobile         (Optional) Set to true to get only mobile user-agents, or false to get only desktop user-agents.
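
For example, here is a request sketch combining both optional parameters (the values are chosen for illustration):

import requests

# Request 50 mobile user-agents using the documented num_results
# and mobile parameters.
response = requests.get(
    'http://headers.scrapeops.io/v1/user-agents',
    params={
        'api_key': 'YOUR_API_KEY',
        'num_results': 50,
        'mobile': 'true',
    },
)

print(response.json()['result'])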

Regular Updates

Our user-agent database is updated every 3-6 months to stay in sync with the latest browser versions and operating system releases, ensuring your scraper always uses current, realistic user-agents.
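
Since the list evolves over time, a long-running scraper can refresh its cached list periodically rather than only at startup. Here is a sketch, where the REFRESH_INTERVAL value is an illustrative choice rather than a documented requirement:

import time
import requests

SCRAPEOPS_API_KEY = 'YOUR_API_KEY'
REFRESH_INTERVAL = 7 * 24 * 60 * 60  # Refresh weekly (an illustrative choice).

_cached_list = []
_last_fetch = 0.0

def get_user_agent_list():
    # Re-fetch the user-agent list if the cached copy is stale.
    global _cached_list, _last_fetch
    if not _cached_list or time.time() - _last_fetch > REFRESH_INTERVAL:
        response = requests.get(
            'http://headers.scrapeops.io/v1/user-agents',
            params={'api_key': SCRAPEOPS_API_KEY},
        )
        _cached_list = response.json().get('result', [])
        _last_fetch = time.time()
    return _cached_list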