

Asocks Residential Proxies: Web Scraping Guide

Asocks is a proxy service provider offering residential proxies from over 7 million IP addresses across 150 locations worldwide. One of its key features is "pay as you go" pricing, where you only pay for the traffic used, rather than specific IPs or locations.

Today, we'll walk through the process of getting started with Asocks and see exactly what it takes to get up and running.


TLDR: How to Integrate Asocks Residential Proxy?

Connecting to Asocks is pretty simple.

  1. First, you need to create a proxy port from their dashboard.
  2. Once you've done that, you can retrieve your username, password and ip_address_and_port.
  3. Save them all to a config.json file like you see below.
{
    "username": "your-asocks-username",
    "password": "your-asocks-password",
    "ip_address_and_port": "{your-asocks-ip-address}:{your-asocks-port-number}"
}

To test it, you can use the following code.

import requests
import json
from bs4 import BeautifulSoup
from urllib.parse import urlencode

username = ""
password = ""
ip_address_and_port = ""

with open("config.json", "r") as config_file:
    config = json.load(config_file)
    username = config["username"]
    password = config["password"]
    ip_address_and_port = config["ip_address_and_port"]


proxies = {
    "http": f"http://{username}:{password}@{ip_address_and_port}",
    "https": f"http://{username}:{password}@{ip_address_and_port}",
}

if __name__ == "__main__":

    url = "https://api.ipify.org?format=json"

    response = requests.get(url)

    ip = response.json()["ip"]

    print("real ip address:", ip)

    proxied_response = requests.get(url, proxies=proxies)

    proxied_ip = proxied_response.json()["ip"]

    print("proxy ip address:", proxied_ip)
  • The code above uses a dict object to hold your proxy settings.
  • requests.get(url, proxies=proxies) assigns our proxy settings to our Python Requests client.
  • We then ping the ipify API to see our real IP address and our proxied one.

Understanding Residential Proxies

Residential proxies are IP addresses assigned by Internet Service Providers (ISPs) to homeowners. These proxies make your web requests appear as though they’re coming from a real residential user, not a data center or proxy network.

This makes them highly reliable for bypassing geo-restrictions and avoiding detection on websites that use sophisticated anti-scraping measures.

Why Are Residential Proxies Important?

Residential proxies are incredibly important when it comes to scraping the web. When you use a service like the ScrapeOps Proxy Aggregator (with or without customization), the proxy provider first tries to get your content using only a datacenter proxy. If that fails, most providers will switch to a residential or mobile IP address in order to retrieve your content.

Asocks boasts a very large pool of residential IP addresses. When we connect to Asocks, we have to manually set up each of our proxies. In the coming sections, we'll dive deeper into the process of signing up, creating our first proxy port, and connecting to it.

Types of Residential Proxies

There are two main types of residential proxies: Rotating and Static.

  1. Rotating Residential Proxies: These proxies rotate the IP address with each request or after a set time, drawing from a large pool of residential IPs (see the client-side rotation sketch after this list).

Pros:

  • Anonymity: With a constantly changing IP address, it’s difficult for websites to track or block the user.
  • Scalability: Ideal for web scraping large websites where multiple requests are needed without getting flagged.
  • Better for large data collection: Perfect for scraping thousands of pages or accessing restricted content since the IPs rotate frequently.

Cons:

  • Inconsistency: Because the IPs change constantly, some websites may flag or block unfamiliar IPs.
  • Potential disruptions: If IPs rotate too frequently, sessions might get interrupted, causing issues with logins or tracking across multiple requests.
  2. Static Residential Proxies: Provide a consistent IP address over time, mimicking a real user consistently accessing a site from the same location.

Pros:

  • Stability: The constant IP allows for seamless sessions without the need to log in repeatedly.
  • Ideal for long-term tasks: Good for scenarios where maintaining a single IP is important, such as account management or accessing location-specific content over time.

Cons:

  • Higher risk of blocking: Since the IP remains static, websites may detect and block it after repeated use.
  • Less anonymity: The consistent use of the same IP may make the user more vulnerable to tracking or detection by anti-bot measures.
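
As we'll see later in this guide, Asocks ports are sticky rather than rotating, so if you need rotation-like behavior, one option is to create several ports and cycle through them yourself. Here's a minimal sketch of that idea; the port list below is purely hypothetical placeholder data:

import itertools
import requests

# Hypothetical placeholders: ports you would create in the Asocks dashboard.
proxy_ports = [
    "user1:pass1@203.0.113.1:10000",
    "user2:pass2@203.0.113.2:10000",
    "user3:pass3@203.0.113.3:10000",
]
rotation = itertools.cycle(proxy_ports)

def rotating_get(url):
    """Send each request through the next proxy port in the rotation."""
    port = next(rotation)
    proxies = {"http": f"http://{port}", "https": f"http://{port}"}
    return requests.get(url, proxies=proxies)

for _ in range(3):
    print(rotating_get("https://api.ipify.org?format=json").json())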

Here’s a breakdown of their features:

| Feature | Rotating Residential Proxies | Static Residential Proxies |
| --- | --- | --- |
| IP Address | Changes with each request | Remains the same for an extended period |
| Anonymity | High | Moderate |
| Speed | Slower due to rotation process | Generally faster |
| Management | Complex | Simpler |
| Risk of Detection | Lower | Higher |

Residential vs. Datacenter Proxies

Residential proxies are frequently compared to datacenter proxies, which are IP addresses supplied by datacenters instead of physical devices.

Data centers provide large pools of IP addresses used by businesses for various services. While these proxies are fast and cost-effective, they are easier to detect and block because their traffic patterns are often seen as unnatural or automated.

Pros of Residential Proxies:

  • Seen as real users by websites, reducing the risk of blocks.
  • Access to IPs from specific countries, cities, or regions, ideal for geo-sensitive tasks.
  • Better for bypassing CAPTCHAs and accessing content restricted by location.

Cons of Residential Proxies:

  • More expensive due to their higher legitimacy and trustworthiness.
  • Slower than data center proxies, as residential networks can be less optimized.

Pros of Data Center Proxies:

  • Typically faster since they use high-performance data center infrastructure.
  • Generally cheaper than residential proxies.

Cons of Data Center Proxies:

  • Websites often flag and block data center IPs, especially for tasks like web scraping.
  • More likely to be recognized as a bot or suspicious traffic by sophisticated anti-scraping systems.

Here's a comparison of Residential and Data Center proxies to better understand the differences:

| Feature | Residential Proxies | Data Center Proxies |
| --- | --- | --- |
| IP Source | IPs assigned by ISPs to real households | IPs from cloud service providers or data centers |
| Anonymity | High anonymity; difficult for websites to detect and block | Easier to detect and block due to unnatural traffic patterns |
| Cost | More expensive due to high trust and reliability | Generally cheaper than residential proxies |
| Speed | Slower, as they rely on real user networks | Faster due to data center infrastructure |
| Reliability | Higher reliability for long-term tasks or sensitive scraping | Good for non-sensitive tasks but may face blocks quickly |
| Use Case | Ideal for geo-restricted content, ad verification, web scraping | Suitable for tasks with lower security needs or temporary use |
| Blocking Risk | Lower risk due to the residential nature of the IP | Higher risk as data center IPs are easily detected and blocked |

While residential proxies offer higher trust and reliability, data center proxies are faster and more affordable but come with higher risks of detection and blocking.

When Are Residential Proxies Useful?

Residential proxies offer high anonymity and reliability, making them essential for tasks that require bypassing strict security measures or accessing geo-restricted content.

Some of the use cases where residential proxies are handy:

  • Web Scraping and Data Collection: Perfect for scraping data from large, sophisticated websites with anti-bot systems, such as e-commerce platforms (e.g., Amazon or eBay).
  • SEO and SERP Analysis: Useful for monitoring search engine results specific to different regions and improving localized SEO strategies.
  • Social Media Monitoring: Helps manage multiple accounts, collect social media data, and bypass restrictions imposed on automated tools by platforms like Instagram or Facebook.
  • Ad Verification: Allows advertisers and agencies to verify that ads are being displayed correctly to users in different locations or across different devices.
  • Accessing Geo-Restricted Content: Ideal for bypassing location-based content restrictions on streaming services or websites that limit access based on a user's region.

Why Use Asocks Residential Proxies?

Asocks gives us a very large set of IP addresses to work with. According to their site, they give us access to 150+ locations, over 7 million IP addresses, and they've had over 100,000 satisfied customers. You can verify this in the screenshot below.

Asocks Homepage

This giant pool of IP addresses gives us a super reliable connection and also gives us the ability to narrow our IP down to a specific city if we want.

With most standard proxy services, we can get geolocation support, but it is typically limited to a specific country. With Asocks, we can choose our country, state, and city.


Asocks Residential Proxy Pricing

Asocks has a very straightforward pricing model: you pay $3/GB as you go. Check out the screenshots below. The first shows the price for 1 GB, and the second shows the price for 1 TB. We'll go through the purchase process and find out if it's as good as it sounds.

1 GB Pay as you go

1 TB Pay as you go

Pricing Table

While the pricing is all the same, we'll lay it out in a table for you to view below.

| Bandwidth | Price |
| --- | --- |
| 1 GB | $3 |
| 120 GB | $360 |
| 320 GB | $960 |
| 425 GB | $1,275 |
| 602 GB | $1,806 |
| 724 GB | $2,172 |
| 850 GB | $2,550 |
| 1 TB | Less than $3,000 |

Pricing Comparison

Generally speaking, proxy providers that charge around $2-3 per GB are considered cheap, while plans in the $6-8 per GB range are on the more expensive side.

If you'd like to shop around for other residential providers, we built a tool for that here.

Here's what our page looks like. You can compare proxies using all sorts of different criteria.

Proxy Comparison Tool


Setting Up Asocks Residential Proxies

Creating an account with Asocks is pretty easy. You can create an account using just an email/password, or you can choose to create an account with Google. The things that come afterward, though, are something of a pain to deal with.

For starters, there is no simple way to sign up for a free trial. They promise a free 1 GB credit to your account if you link your Telegram to their bot. However, in our case, this did not work.

When opening the link on desktop, this is what I received. I was unable to even start a chat with the bot on Telegram.

Broken Free Trial

I did try on mobile as well; since I was using Ubuntu on desktop, there may have been a compatibility issue. Here is their mobile dashboard with the Telegram bonus link.

Asocks Mobile

After clicking the link, I was taken to a chat with their bot; however, I was never given a promo code. You can view evidence of this below.

Asocks Telegram Chatbot

As you can see in the image above, there is an option to open the form. Upon doing that, I received the following screen prompting me to enter my promo code, but they never gave me a promo code!

Promo Code

After about an hour of time spent trying to get the Telegram bot to work, I finally gave up and decided to buy 1 GB of service. However, this too comes with a bit of a catch: Asocks has a minimum deposit of $15.

On their balance tab, there is no way to withdraw money either, so any money you send to your account is apparently stuck there until you use it for bandwidth.

Topping up your account is something of a stressful process as well. As you can see, they do accept payments through a list of different providers.

Payment Processors List

This comes with a bit of a catch as well, though: many of their payment processors do not provide clear English-language pages to use.

On top of using a disreputable payment processor, I'm being prompted to enter my credit card information on a page I can't even read. Because I've never heard of many of these providers, I decided to ask ChatGPT for some background information on them. Here is the response I received.

Based on the image, here are some details about the listed payment processors:

CloudPayments: A well-known payment platform mainly used in Russia and CIS countries for accepting online payments. It appears legitimate, and many businesses use it, but it's advisable to check local regulations or restrictions if you plan on using it.

NihaoPay: This is a legitimate global payment gateway, primarily for Chinese consumers, offering payment solutions for businesses that need to accept payments from China. It's generally reliable but may have geographic limitations.

USDT (TRC-20): This refers to the Tether stablecoin on the TRON blockchain. Tether is widely used and recognized as legitimate, but be cautious of the platform offering the wallet or exchange services.

BTC: Bitcoin is, of course, a well-known and widely accepted cryptocurrency. However, ensure that the exchange or wallet you're using is trustworthy.

Enot: This one is less well-known. There have been some mentions online, but you should exercise caution and check for recent reviews or information about its legitimacy.

PayPal: This is likely mimicking PayPal (given the logo), but with a slightly altered name, "PayPalych," it could be an attempt to spoof or create confusion. I would advise avoiding this option.

Upon reading this, I elected to pay with Bitcoin. With BTC, I can simply send the payment without giving out my credit card information. If you've ever paid with Bitcoin before, it's a relatively straightforward process.

  1. When you elect to pay with Bitcoin, you are brought to the screen below.

Pay with Bitcoin

  2. Then you open up your BTC wallet and copy the payment address you are given on the screen above.

  3. Choose to send from your BTC wallet, and paste the BTC address from the screen above into the "To" or "Receiver" field. You can see an example using a blank wallet below. I used Exodus for the example because it's a pretty common Bitcoin wallet.

  4. Once you send your payment, you need to wait for it to confirm. You'll see a screen similar to the one below.

Wait for Confirmation

  5. After your payment has been confirmed (this can take anywhere from 10 to 30 minutes), your balance will be credited the dollar value of your Bitcoin deposit.

However, as far as crypto processing goes, this definitely feels like legacy technology. There are many more advanced crypto gateways out there that not only look more modern, but also confirm payments more quickly.

  6. You can see our credited balance in the Balance section of the screenshot below.

Balance Confirmed

The initial setup process with Asocks is not user friendly at all.

While the Bitcoin payment option is definitely a safe way to do this, it seems like a company as large as Asocks could at least give us the option to pay using either PayPal, Coinbase Commerce (for crypto, it is much faster and more convenient than waiting for 3 Bitcoin confirmations), or something a little more reputable.


Authentication

The Asocks API documentation is available for review here. We get several options when it comes to authentication. For starters, you can whitelist your IP address by clicking the Address whitelist tab as you see below.

Address Whitelist

With the whitelist, we can set up a basic proxy connection and make requests without having to use our API key for authentication.

However, if you want to do anything further with their proxy API, an API key is still required.
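
As a rough illustration, if your machine's public IP is on the whitelist, you can typically point Python Requests at a port you've created without embedding credentials in the proxy URL. This is a minimal sketch with a placeholder host and port; the exact behavior depends on how your port is configured.

import requests

# Placeholder: the host:port of a proxy port you created in the dashboard.
# Assumes your public IP is already on the Address whitelist, so no
# username/password is embedded in the proxy URL.
ip_address_and_port = "203.0.113.10:10000"

proxies = {
    "http": f"http://{ip_address_and_port}",
    "https": f"http://{ip_address_and_port}",
}

print(requests.get("https://api.ipify.org?format=json", proxies=proxies).json())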

To begin making requests, we need to set up a proxy port. While they don't have extensive documentation like we do here at ScrapeOps, they do give us a port builder that allows us to create customized ports.

For starters, choose the type of IP addresses you want.

Create a new port

Next, you can click the Countries tab and choose a country. They allow you to browse a list, or type your country into a search bar.

Choose country

Depending on your country of choice, you can also choose a Region and a City.

In the image below, you can see a list of cities in Michigan. Each city has a number of bars next to it indicating the quality of the proxy connection.

You should notice that Detroit is the only city with any bars, and even that connection is not at full strength.

Choose city


Basic Request Using Asocks Residential Proxies

Here, we'll make a simple request using their API. After we've made our first proxy port, we need to save the login, password, and ip_address_and_port. Connecting Python Requests to the proxy port is very simple.

import requests
import json
from bs4 import BeautifulSoup
from urllib.parse import urlencode

username = ""
password = ""
ip_address_and_port = ""

with open("config.json", "r") as config_file:
    config = json.load(config_file)
    username = config["asocks"]["spain"]["username"]
    password = config["asocks"]["spain"]["password"]
    ip_address_and_port = config["asocks"]["spain"]["ip_address_and_port"]


proxies = {
    "http": f"http://{username}:{password}@{ip_address_and_port}",
    "https": f"http://{username}:{password}@{ip_address_and_port}",
}

if __name__ == "__main__":

    url = "https://api.ipify.org?format=json"

    response = requests.get(url)

    ip = response.json()["ip"]

    print("real ip address:", ip)

    proxied_response = requests.get(url, proxies=proxies)

    proxied_ip = proxied_response.json()["ip"]

    print("proxy ip address:", proxied_ip)

The code above assumes we've saved our proxy information in a config file. If you want to hard-code your proxy information directly, you can (although this is not considered best practice).

  • Proxy ports are laid out like this: "http://{username}:{password}@{ip_address_and_port}".
  • To use these ports with Python, we simply create a dict of our proxies and pass it in via the proxies kwarg when making our requests: requests.get(url, proxies=proxies).
  • The code above checks our IP address without a proxy, and then prints it.
  • Then, the code checks the IP address of our proxied connection.

When we run it, we get this output below. As you can see, we have two separate IP addresses.

IP Check

The proxy was created in Spain, so we'll head over to https://ipinfo.io/ to verify this.

First, we'll check our non-proxied IP address.

IP Info Homepage

Now, let's take a look at the proxied IP.

IP Info Homepage

As you can see above, our Spanish Proxy is showing up in Madrid, Spain. Everything is working as intended.


Country Geotargeting

Country-level geotargeting allows users to select IP addresses from specific countries, making it appear as if they are browsing from that region.

Residential proxies use real IPs from legitimate households, so websites view the traffic as coming from actual users in the targeted country.

This is particularly useful for market research, competitor analysis, or social media monitoring. For instance, a business might want to scrape data from a specific country's e-commerce website or access content only available in that country.

Top 5 Countries Supported by Asocks

Asocks supports proxies from over 150 countries, making it versatile for a wide range of geolocation needs. While its proxy pool includes more than 7 million IPs, the most popular locations include Russia, India, and Western Europe.

Below is a table showcasing the 5 most popular countries supported by Asocks, along with the number of IPs available in each:

| Country | Number of IPs |
| --- | --- |
| Russia | 644,716 |
| India | 609,654 |
| Kazakhstan | 93,092 |
| United States | 58,233 |
| Ukraine | 51,408 |

Using Country-Specific Proxies

If you followed along earlier, you already know that we have to select a country when creating our port. You can choose pretty much any location you want. Once again, take a look at the Countries tab.

Choose country

To make requests, you need to set up a port, and to create a port, you have to choose a country. There is no way around this.
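
Since each port is tied to one country, it helps to store each port under a country key in config.json; that's the convention the rest of this article uses (e.g. config["asocks"]["spain"]). Here's a minimal sketch of picking a country-specific port at runtime; the key names are just our own convention, not anything required by Asocks:

import json
import requests

COUNTRY = "spain"  # any country key you've saved a port under

with open("config.json", "r") as config_file:
    config = json.load(config_file)
    port = config["asocks"][COUNTRY]

proxy_url = f"http://{port['username']}:{port['password']}@{port['ip_address_and_port']}"
proxies = {"http": proxy_url, "https": proxy_url}

print(requests.get("https://api.ipify.org?format=json", proxies=proxies).json())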


City Geotargeting

City geotargeting allows users to specify an IP address from a particular city rather than just a country or region.

This level of precision enables businesses and individuals to simulate internet browsing from a specific city, enhancing the targeting accuracy for tasks like localized testing, marketing, and data collection.

Using City-Specific Proxies

As mentioned before, we can also choose a city for our proxy port. First, we need to create a new port. This time, after selecting a country, go to the Cities tab.

You can view an example below. We're going to use Philadelphia.

Choose City

Once you've created the proxy, remember to save your username, password, and ip_address_and_port to either your config file or directly inside your script.

The example below performs the same test we did earlier, but with our new proxy port that has been set to Philadelphia.

import requests
import json
from bs4 import BeautifulSoup
from urllib.parse import urlencode

username = ""
password = ""
ip_address_and_port = ""

with open("config.json", "r") as config_file:
    config = json.load(config_file)
    username = config["asocks"]["philadelphia"]["username"]
    password = config["asocks"]["philadelphia"]["password"]
    ip_address_and_port = config["asocks"]["philadelphia"]["ip_address_and_port"]


proxies = {
    "http": f"http://{username}:{password}@{ip_address_and_port}",
    "https": f"http://{username}:{password}@{ip_address_and_port}",
}

if __name__ == "__main__":

    url = "https://api.ipify.org?format=json"

    response = requests.get(url)

    ip = response.json()["ip"]

    print("real ip address:", ip)

    proxied_response = requests.get(url, proxies=proxies)

    proxied_ip = proxied_response.json()["ip"]

    print("proxy ip address:", proxied_ip)

Here is our new output.

IP City Check Terminal

Now, we just need to ensure that our location is correct.

IPInfo Verification

So, their city selection does not work as intended. As you saw earlier, the location we selected was Philadelphia. However, our IP address actually put our location in Homestead, Florida. This is 1680 km (1044 miles) away from where we want to be!


How to Use Static Proxies

Static proxies are proxies that provide a fixed IP address, meaning the IP does not change for the duration of the session. That's why they're often called "sticky sessions". When using a static proxy, we keep the same IP address for the duration of the browsing session.

With static proxies, users appear to be consistently accessing the internet from the same location, making them ideal for tasks requiring stable, uninterrupted connections.

Key Benefits of Static Proxies

There are certain benefits of using static proxies.

  • Consistency: The same IP address is maintained throughout your session, making them ideal for activities that require stable IPs.
  • IP Whitelisting: They work well for services that rely on a trusted, unchanging IP.
  • Bypassing Geo-restrictions: Static proxies can be used to access geo-blocked content without changing locations.
  • Session Persistence: Great for tasks like managing multiple accounts or scraping data that needs consistent session information.

Common Use Cases for Static Proxies

Static proxies are useful for tasks requiring consistent IP addresses to avoid detection and ensure reliable access.

  • Social Media Management: Maintaining multiple accounts without triggering suspicious activity.
  • Ad Verification: Ensuring that ads appear correctly in specific regions.
  • Web Scraping: Gathering data while maintaining a consistent IP to avoid detection.
  • SEO Monitoring: Tracking rankings from a consistent location without changing IP addresses.
  • Accessing Geo-restricted Content: Bypassing regional restrictions without rotating IPs.
  • Account Creation: Ensuring reliability when creating multiple accounts for services.

Example of Using Static Proxies with Python

With Asocks, all of our proxy ports behave as static/sticky sessions. A session expires after 15 minutes; after expiration, we receive a new IP, but until then, we reuse the same IP address.
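
To see the sticky behavior for yourself, you can hit ipify twice through the same port and compare the results. Here's a minimal sketch reusing the Spanish port from earlier; within the 15-minute window, both requests should report the same IP.

import json
import requests

with open("config.json", "r") as config_file:
    port = json.load(config_file)["asocks"]["spain"]

proxy_url = f"http://{port['username']}:{port['password']}@{port['ip_address_and_port']}"
proxies = {"http": proxy_url, "https": proxy_url}

# Two back-to-back requests through the same port should return the same IP
# while the sticky session is still alive.
first = requests.get("https://api.ipify.org?format=json", proxies=proxies).json()["ip"]
second = requests.get("https://api.ipify.org?format=json", proxies=proxies).json()["ip"]

print("first request: ", first)
print("second request:", second)
print("same IP reused:", first == second)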

The outputs below are from our original port that we created in Spain. As you can see, on both runs, we have the same proxy address.

IP Address Terminal

Now, we just want to verify that this is a Spanish IP address.

IP Info Verification

As you can see above, we keep our same location over multiple requests and our proxy location still shows up in Spain. Without using a specific city, everything does seem to be working correctly.


Error Codes

In their documentation, Asocks does not provide a list of status codes. However, we'll do our best to explain some common status codes you might run into when working with a proxy provider.

Based on Mozilla's web development documentation, we put together a table of the most common status codes we run into when building things on the web.

| Status Code | Description |
| --- | --- |
| 200 | Everything worked! |
| 400 | Bad Request (you may need to rewrite your request) |
| 401 | Unauthorized |
| 402 | Payment Required |
| 403 | Forbidden Request |
| 404 | Page Not Found |
| 408 | Request Timed Out |
| 422 | Unprocessable Content |
| 429 | Too Many Requests |
| 500 | Internal Server Error |
| 501 | Method Not Implemented (doesn't exist) |
| 502 | Bad Gateway |
| 503 | Service Unavailable |
| 504 | Gateway Timeout |

The list above is non-exhaustive, but will give you a good cheat sheet for different status codes. You can view the full list of Web Development Status Code standards from Mozilla here.
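
In practice, it's worth handling a few of these codes explicitly when scraping through a proxy port. Below is a general-purpose sketch of a retry loop that backs off on 429 and 5xx responses. Since Asocks doesn't document its own error codes, treat this as a generic pattern rather than provider-specific behavior.

import time
import requests

RETRY_CODES = {429, 500, 502, 503, 504}  # transient codes worth retrying

def get_with_retries(url, proxies, max_retries=3):
    """Fetch a URL through the proxy, retrying transient failures with a backoff."""
    for attempt in range(1, max_retries + 1):
        response = requests.get(url, proxies=proxies, timeout=30)
        if response.status_code == 200:
            return response
        if response.status_code in RETRY_CODES:
            time.sleep(2 ** attempt)  # exponential backoff: 2s, 4s, 8s...
            continue
        # Other 4xx codes (401, 403, etc.) usually mean a config problem,
        # not a transient error, so fail fast.
        response.raise_for_status()
    raise RuntimeError(f"Failed after {max_retries} retries: {url}")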


KYC (Know Your Customer) Verification

KYC verification can be a huge pain for many people in the development community, particularly when dealing with proxy and crypto service providers. The KYC process for Asocks is actually pretty easy. So far, it's one of the only painless things we've had to deal with.

When signing up for Asocks, all you need to do is create an account. With several other providers, you need to go through an extensive KYC process, which is essentially a background check (and sometimes even a video interview!), in order to use their services.

The KYC process can be both lengthy and invasive to your privacy. Asocks does not make new subscribers undergo stringent KYC verification.

If you'd like to learn more about the KYC policies of other providers, you can get a good feel for BrightData's KYC process here.


Implementing Asocks Residential Proxies in Web Scraping

Using Asocks requires a proxy port integration, so it isn't natively supported by every tool. Earlier, we integrated an Asocks proxy port with Python Requests.

Here is a more complete list of ways to integrate with an Asocks Proxy Port.

Python Requests

proxies = {
    "http": f"http://{username}:{password}@{ip_address_and_port}",
    "https": f"http://{username}:{password}@{ip_address_and_port}",
}

requests.get("https://quotes.toscrape.com", proxies=proxies)

Python Selenium

Authenticated proxies require a username and password (or something similar, like an API key). Every proxy service that charges money uses authentication.

Vanilla Selenium doesn't support authenticated proxies out of the box. You can technically do it with SeleniumWire, but SeleniumWire has been deprecated.

Long story short, doing so creates security and compatibility problems and it will only get worse over time.
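
One workaround is to skip credential authentication entirely: if you add your machine's IP to the Address whitelist (see the Authentication section above), you can pass the port to vanilla Selenium as an unauthenticated proxy. Here's a minimal sketch under that assumption, with a placeholder host and port:

from selenium import webdriver

# Placeholder for your Asocks port. Assumes your public IP is already on the
# Address whitelist, so no username/password is needed.
ip_address_and_port = "203.0.113.10:10000"

options = webdriver.ChromeOptions()
options.add_argument(f"--proxy-server=http://{ip_address_and_port}")

driver = webdriver.Chrome(options=options)
driver.get("https://api.ipify.org?format=json")
print(driver.page_source)
driver.quit()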

Python Scrapy

import scrapy

class ProxySpider(scrapy.Spider):
    name = "proxy_spider"

    def start_requests(self):
        url = "http://quotes.toscrape.com/"

        # Proxy with authentication
        proxy = f"http://{username}:{password}@{ip_address_and_port}"

        yield scrapy.Request(url=url, callback=self.parse, meta={"proxy": proxy})

    def parse(self, response):
        print("Response:", response.text)

NodeJs Puppeteer

const puppeteer = require('puppeteer');

(async () => {
    const browser = await puppeteer.launch({
        headless: false, // Set to true for headless mode
        args: [
            '--proxy-server=your_proxy_ip:your_proxy_port' // Set your proxy server here
        ]
    });

    const page = await browser.newPage();

    // Handle proxy authentication
    await page.authenticate({
        username: 'your_username', // Proxy username
        password: 'your_password' // Proxy password
    });

    // Go to a website to check IP
    await page.goto('https://api.ipify.org?format=json');

    const content = await page.content();
    console.log(content);

    await browser.close();
})();

NodeJs Playwright

const { chromium } = require('playwright');

(async () => {
    const browser = await chromium.launch({
        headless: false, // Set to true for headless mode
        proxy: {
            server: 'http://your_proxy_ip:your_proxy_port', // Proxy server with authentication
            username: 'your_username', // Proxy username
            password: 'your_password' // Proxy password
        }
    });

    const context = await browser.newContext();
    const page = await context.newPage();

    // Go to a website to check IP
    await page.goto('https://api.ipify.org?format=json');

    const content = await page.content();
    console.log(content); // This will show the response content

    await browser.close();
})();


Case Study - Scraping Amazon.es

In this case study, we'll show why targeting specific geographic regions matters when scraping eCommerce sites. For example, if you're in Portugal, you might see different products compared to someone in Spain.

Therefore, let’s scrape Amazon using Portugal and Spain geo-locations and compare the results. To achieve this, we'll utilize Asocks' residential proxies with Python.

We'll try retrieving the page with both a Portuguese IP address and a Spanish one. Take a look at our code below. It's pretty simple.

  1. First, we connect to a proxy port in Portugal and write the response from Amazon.
  2. Next, we connect with the Spanish port we created at the beginning of the article.
  3. After receiving each response, we write it to an HTML file so we can look at the differences in our browser.

Here is our code.

import requests
import json
from bs4 import BeautifulSoup
from urllib.parse import urlencode

username = ""
password = ""
ip_address_and_port = ""

with open("config.json", "r") as config_file:
    config = json.load(config_file)
    spanish_username = config["asocks"]["spain"]["username"]
    spanish_password = config["asocks"]["spain"]["password"]
    spanish_ip_address_and_port = config["asocks"]["spain"]["ip_address_and_port"]
    portuguese_username = config["asocks"]["portugal"]["username"]
    portuguese_password = config["asocks"]["portugal"]["password"]
    portuguese_ip_address_and_port = config["asocks"]["portugal"]["ip_address_and_port"]


spanish_proxies = {
    "http": f"http://{spanish_username}:{spanish_password}@{spanish_ip_address_and_port}",
    "https": f"http://{spanish_username}:{spanish_password}@{spanish_ip_address_and_port}",
}

portuguese_proxies = {
    "http": f"http://{portuguese_username}:{portuguese_password}@{portuguese_ip_address_and_port}",
    "https": f"http://{portuguese_username}:{portuguese_password}@{portuguese_ip_address_and_port}"
}

if __name__ == "__main__":

    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36"
    }

    url = "https://www.amazon.es/s?k=portátil"

    portuguese_response = requests.get(url, proxies=portuguese_proxies, headers=headers)

    with open("portuguese.html", "w") as file:
        file.write(portuguese_response.text)

    spanish_response = requests.get(url, proxies=spanish_proxies, headers=headers)

    with open("spanish.html", "w") as file:
        file.write(spanish_response.text)

The code above performs a simple search for laptops on Amazon.es. It then saves each page to an HTML file. You can view screenshots of both the Portuguese and Spanish results below.

Here is the site when accessed from Portugal.

Amazon.es Portuguese

Amazon.es Portuguese

Here are some shots from using our Spanish proxy.

Amazon.es Spanish

Amazon.es Spanish
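
If you'd rather compare the two result sets programmatically instead of eyeballing screenshots, you can parse the saved HTML files with BeautifulSoup (which we already import). The "h2 a span" selector below is an assumption about Amazon's current result markup and may need adjusting.

from bs4 import BeautifulSoup

def extract_titles(filename):
    """Pull product titles out of a saved Amazon search results page."""
    with open(filename, "r") as file:
        soup = BeautifulSoup(file.read(), "html.parser")
    # Assumed selector for result titles; tweak it if Amazon's markup differs.
    return [span.get_text(strip=True) for span in soup.select("h2 a span")]

portuguese_titles = extract_titles("portuguese.html")
spanish_titles = extract_titles("spanish.html")

print("Only in Portuguese results:", set(portuguese_titles) - set(spanish_titles))
print("Only in Spanish results:   ", set(spanish_titles) - set(portuguese_titles))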


Alternative: ScrapeOps Residential Proxy Aggregator

As you've noticed in this article, Asocks is a pain to deal with.

ScrapeOps Residential Proxy Aggregator provides access to the top 20 residential proxy providers through a single port, ensuring a high success rate for web scraping tasks. It automatically switches proxies to avoid blocks, optimizing performance and cost with flexible pricing plans.

  • Access to the top 20 residential proxy providers, including Smartproxy, Bright Data, and Oxylabs.
  • 98% success rate due to automatic proxy switching.
  • Bypasses anti-bot measures and avoids blocks.
  • Optimizes performance and cost by monitoring proxy performance and pricing.
  • Flexible pricing plans starting at $15 per month, with up to $999 for higher usage.
  • 500 MB of free bandwidth credits to start.

There are a number of areas where we beat Asocks on both service and plans.

| Aspect | Asocks | ScrapeOps |
| --- | --- | --- |
| Signup | $3/GB (minimum purchase 5 GB) | 500 MB free |
| Country Targeting | Yes (required) | Optional |
| City Targeting | Yes (unreliable, see here) | No |
| Smallest Paid Plan | $15 for 5 GB ($3/GB) | $15 for 3 GB ($5/GB) |
| Mid-Grade Plan | $150 for 50 GB ($3/GB) | $149 for 50 GB ($2.98/GB) |
| Large Plan | $1,500 for 500 GB ($3/GB) | $999 for 500 GB ($1.99/GB) |
| Payment Options | Disreputable gateways, BTC, USDT (TRC-20) | Stripe |

Let's take a look at our simple IP checker example with ScrapeOps.

import requests
import json
from bs4 import BeautifulSoup
from urllib.parse import urlencode

API_KEY = ""
with open("config.json", "r") as config_file:
    config = json.load(config_file)
    API_KEY = config["scrapeops_api_key"]


proxies = {
    "http": f"http://scrapeops:{API_KEY}@residential-proxy.scrapeops.io:8181",
    "https": f"http://scrapeops:{API_KEY}@residential-proxy.scrapeops.io:8181",
}

if __name__ == "__main__":

    url = "https://api.ipify.org?format=json"

    response = requests.get(url)

    ip = response.json()["ip"]

    print("real ip address:", ip)

    proxied_response = requests.get(url, proxies=proxies)

    proxied_ip = proxied_response.json()["ip"]

    print("proxy ip address:", proxied_ip)

You can view the results below.

ScrapeOps IP Check


Ethical Considerations

Asocks does not mention whether their proxies are ethically sourced. ScrapeOps' providers (especially BrightData) tend to use ethically sourced proxies. When using Asocks Residential Proxies, you need to make sure that you're scraping data legally and that you're following the laws applicable to the information you're scraping.

Asocks does have a list of prohibited websites. If you look at their FAQ section here and click on Limitations, you will get a list of sites that you're prohibited from accessing via their proxy service.

Anytime you scrape a website, you need to pay attention to their Terms of Service and their robots.txt. If you're unsure about the legality of a scraping job, remember that public data is generally legal to scrape.

Private data (data gated behind a login) is a completely different story. If you are still unsure, consult an attorney.


Conclusion

Residential proxies are an excellent way to scrape the web. They can provide far more reliable access than a typical datacenter IP address.

Now that you know how to integrate with a proxy port, go ahead and build something using a residential proxy and take advantage of our 500MB of free bandwidth here at ScrapeOps!

If you'd like to know more about any of the tools or frameworks used in this article (Python Requests, BeautifulSoup, Scrapy, Puppeteer, and Playwright), check out their official documentation.


More Web Scraping Guides

Now that you've gotten your feet wet with each of these tools, go build something!

If you're in the mood to binge read, check out our extensive Python Web Scraping Playbook, or if you want to learn more about residential proxies, take a look at the articles below.