# How to Send POST Requests With Python aiohttp

To send POST requests with Python aiohttp, first create a session object using aiohttp.ClientSession() and then call the post() method on that session object. Add the POST body using the data (or json) parameter and set the Content-Type using the headers parameter.


```python
import aiohttp
import asyncio
import json

async def post_request():
    async with aiohttp.ClientSession() as session:
        # Serialize the payload ourselves so the body matches the Content-Type header
        response = await session.post(
            url="https://httpbin.org/post",
            data=json.dumps({"key": "value"}),
            headers={"Content-Type": "application/json"},
        )
        print(await response.json())

asyncio.run(post_request())
```

In this guide for The Python Web Scraping Playbook, we will look at how to make POST requests with the Python aiohttp library.

We will walk you through the most common ways of sending POST requests with Python aiohttp:

- POST JSON Data Using Python aiohttp
- POST Form Data Using Python aiohttp
- Configuring Data Types

Let's begin...

Need help scraping the web?

Then check out ScrapeOps, the complete toolkit for web scraping.


## POST JSON Data Using Python aiohttp

A common scenario for using POST requests is sending JSON data to an API endpoint. Doing this with Python aiohttp is very simple.

Here we will use Python aiohttp's Session functionality to send POST requests.

First, we use aiohttp.ClientSession() to create a new instance of the ClientSession class. The async with statement acts as a context manager that handles the life cycle of the session object. Next, we make a POST request using the session.post() method, adding the data to the request via the json parameter:


```python
import aiohttp
import asyncio

async def post_request():
    url = "https://httpbin.org/post"
    data = {"key": "value"}

    async with aiohttp.ClientSession() as session:
        response = await session.post(url, json=data)
        print(await response.json())

asyncio.run(post_request())
```

The aiohttp library will automatically encode the data as JSON and set the Content-Type header to application/json.

This approach is simpler and more concise than manually encoding the data and setting the headers. It can also help performance: ClientSession accepts a json_serialize argument, so you can plug in a faster JSON serializer that aiohttp will use for every json= payload sent through that session.
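As a minimal sketch of how that looks, the example below passes a custom serializer to ClientSession. Here it is just functools.partial over the standard library's json.dumps to produce compact output; a third-party serializer such as ujson.dumps could be dropped in the same way (these particular choices are only illustrative):

```python
import aiohttp
import asyncio
import json
from functools import partial

async def post_request():
    # Illustrative: any callable that turns an object into a JSON string works here.
    compact_dumps = partial(json.dumps, separators=(",", ":"))

    # Every json= payload sent through this session will use compact_dumps.
    async with aiohttp.ClientSession(json_serialize=compact_dumps) as session:
        response = await session.post("https://httpbin.org/post", json={"key": "value"})
        print(await response.json())

asyncio.run(post_request())
```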


## POST Form Data Using Python aiohttp

Another common use case for using POST requests is to send form data to an endpoint.

We simply need to add the form data to the request using the data parameter of the POST request:


```python
import aiohttp
import asyncio

async def post_request():
    url = "https://httpbin.org/post"
    data = {"key": "value"}

    async with aiohttp.ClientSession() as session:
        response = await session.post(url, data=data)
        print(await response.text())

asyncio.run(post_request())
```

The aiohttp library will automatically URL-encode the data as form data and set the Content-Type header to application/x-www-form-urlencoded, so you don't have to set any headers yourself.
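If you need multipart form data instead (for example, a file upload), aiohttp's FormData helper builds the multipart body for you. Below is a minimal sketch; the field names and the example.txt file are placeholder values for illustration:

```python
import aiohttp
import asyncio

async def post_multipart():
    # Build a multipart/form-data body with aiohttp's FormData helper.
    # The field names and example.txt are placeholder values.
    form = aiohttp.FormData()
    form.add_field("key", "value")
    form.add_field(
        "file",
        open("example.txt", "rb"),
        filename="example.txt",
        content_type="text/plain",
    )

    async with aiohttp.ClientSession() as session:
        response = await session.post("https://httpbin.org/post", data=form)
        print(await response.json())

asyncio.run(post_multipart())
```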


## Configuring Data Types

As we've seen above, when you use the data or json parameter to send data with a POST request, aiohttp defaults the Content-Type header to application/x-www-form-urlencoded or application/json respectively.

However, if you would like to override this or send data with another Content-Type, you can do so by adding the Content-Type header to the POST request yourself.

In the following example, we serialize the JSON ourselves, send it using the data parameter instead of the json parameter as we did previously, and override the Content-Type header:


```python
import aiohttp
import asyncio
import json

async def post_request():
    url = "https://httpbin.org/post"
    data = {"key": "value"}

    async with aiohttp.ClientSession() as session:
        headers = {"Content-Type": "application/custom-type"}
        json_data = json.dumps(data)

        response = await session.post(url, data=json_data, headers=headers)
        print(await response.json())

asyncio.run(post_request())
```
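The same pattern works for any other body type. As a quick sketch (the text/plain payload here is purely illustrative), you can pass a raw string or bytes via the data parameter and set whatever Content-Type header matches it:

```python
import aiohttp
import asyncio

async def post_plain_text():
    # Illustrative only: send a raw string body with a matching Content-Type header.
    async with aiohttp.ClientSession() as session:
        response = await session.post(
            "https://httpbin.org/post",
            data="hello world",
            headers={"Content-Type": "text/plain"},
        )
        print(await response.json())

asyncio.run(post_plain_text())
```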

---

## More Web Scraping Tutorials

So that's how you can send POST requests using **Python aiohttp**.

If you would like to learn more about Web Scraping, then be sure to check out [The Web Scraping Playbook](/web-scraping-playbook).

Or check out one of our more in-depth guides:

- [How to Scrape The Web Without Getting Blocked Guide](/web-scraping-playbook/web-scraping-without-getting-blocked)
- [The State of Web Scraping 2022](https://scrapeops.io/blog/the-state-of-web-scraping-2022)
- [The Ethics of Web Scraping](/web-scraping-playbook/ethics-of-web-scraping)