Introduction
Asynchronous programming in Python has become increasingly popular, providing a way to write concurrent code that is more efficient and scalable. The asyncio
module, introduced in Python 3.4, has been at the core of this development, offering a rich set of features for writing asynchronous programs. In this article, we will explore some of the most popular Python libraries that leverage asyncio
to achieve concurrency, providing examples and insights into how they can be used in your projects.
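Before turning to third-party libraries, a minimal stdlib-only sketch (no external packages assumed, names are illustrative) shows the core idea asyncio provides: coroutines that would otherwise wait one after another can run concurrently under a single event loop.

```python
import asyncio
import time

async def work(name, delay):
    # Simulate an I/O-bound operation, e.g. a network call.
    await asyncio.sleep(delay)
    return name

async def main():
    start = time.perf_counter()
    # gather() schedules all three coroutines concurrently, so the
    # total wait is ~0.2s rather than 0.2 + 0.1 + 0.2 = 0.5s.
    results = await asyncio.gather(
        work('a', 0.2), work('b', 0.1), work('c', 0.2)
    )
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(results)  # ['a', 'b', 'c'] -- gather preserves argument order
```

All the libraries below apply this same pattern to real I/O: HTTP requests, Redis commands, and SQL queries.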
aiohttp
aiohttp
is an asynchronous HTTP client/server framework. It utilizes asyncio
features to provide a powerful tool for making non-blocking HTTP requests and developing asynchronous web applications.
import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def main():
    # A single ClientSession should be reused across requests.
    async with aiohttp.ClientSession() as session:
        html = await fetch(session, 'http://python.org')
        print(html)

asyncio.run(main())
See also:
- Python & aiohttp: How to download files using streams
- Python & aiohttp: Sending multiple requests concurrently
- Using aiohttp to make POST requests in Python (with examples)
aioredis
aioredis
is an async library for interacting with Redis databases. It offers full support for asyncio and provides a high-level interface for accessing and manipulating Redis data structures asynchronously.
import asyncio
import aioredis

async def main():
    # Note: create_redis_pool is the aioredis 1.x API; in aioredis 2.x
    # the equivalent entry point is aioredis.from_url('redis://localhost').
    redis = await aioredis.create_redis_pool('redis://localhost')
    await redis.set('my-key', 'value')
    value = await redis.get('my-key')
    print(value)  # values come back as bytes, e.g. b'value'
    redis.close()
    await redis.wait_closed()

asyncio.run(main())
See also:
- Python aiofiles: How to Read & Write CSV Files Asynchronously
- Python aiofiles: Read & Write files asynchronously
FastAPI
FastAPI
is a modern, fast (high-performance) web framework for building APIs with Python 3.7+, based on standard Python type hints. It is built on top of Starlette for the web parts and uses Pydantic for the data parts, leveraging asyncio for concurrency.
from typing import Optional

from fastapi import FastAPI

app = FastAPI()

@app.get('/')
async def read_root():
    return {'Hello': 'World'}

@app.get('/items/{item_id}')
async def read_item(item_id: int, q: Optional[str] = None):
    # item_id is validated and converted to int from the path;
    # q is an optional query parameter.
    return {'item_id': item_id, 'q': q}
See also:
- FastAPI: How to Change the Response Status Code
- Write Your First Backend API with FastAPI (Hello World)
- Deploying FastAPI on Ubuntu with Nginx and Let’s Encrypt
asyncpg
asyncpg
is a library designed specifically for PostgreSQL database interaction. It’s one of the fastest libraries available for working with PostgreSQL from Python asynchronously.
import asyncio
import asyncpg

async def run():
    conn = await asyncpg.connect(user='user', password='password',
                                 database='database', host='127.0.0.1')
    values = await conn.fetch('SELECT * FROM mytable')
    await conn.close()
    return values

# asyncio.run() replaces the older get_event_loop()/run_until_complete()
# pattern, which is deprecated in modern Python.
asyncio.run(run())
aiomysql
aiomysql
is an asynchronous library for accessing MySQL databases. It provides a simple and efficient way to interact with MySQL databases using asyncio.
import asyncio
import aiomysql

async def test_example():
    # Passing loop= explicitly is no longer needed; aiomysql picks up
    # the running event loop automatically.
    pool = await aiomysql.create_pool(host='127.0.0.1', port=3306,
                                      user='root', password='',
                                      db='test', autocommit=True)
    async with pool.acquire() as conn:
        async with conn.cursor() as cur:
            await cur.execute("SELECT 42;")
            print(await cur.fetchone())
    pool.close()
    await pool.wait_closed()

asyncio.run(test_example())
Conclusion
Asynchronous programming in Python, with the help of asyncio
and the libraries built upon it, offers a powerful alternative to traditional synchronous coding patterns. Whether you are developing HTTP services, interacting with databases, or building fast APIs, there is likely an async library tailored to your needs. These libraries not only improve the scalability and efficiency of your applications but also provide a modern approach to asynchronous programming in Python. As async features continue to evolve and mature, we can only expect this ecosystem to grow and become even more robust.