Asyncio is Python’s built-in framework for writing asynchronous code — code that can pause while waiting for something (like a network response) and let other code run in the meantime. It’s single-threaded, but incredibly efficient for I/O-bound tasks.
Coroutines: The Building Blocks
A coroutine is a function defined with async def. It doesn’t run when we call it — it returns a coroutine object that we need to await.
import asyncio

async def greet(name):
    print(f"Hello, {name}!")
    await asyncio.sleep(1)  # non-blocking pause
    print(f"Goodbye, {name}!")

# This is how we run it
asyncio.run(greet("Manish"))
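To see the "returns a coroutine object" point concretely, here is a minimal sketch using a variant of greet that returns a value instead of printing. Calling it creates a coroutine object; nothing in its body runs until asyncio.run() (or an await) drives it:

```python
import asyncio

async def greet(name):
    await asyncio.sleep(0)
    return f"Hello, {name}!"

coro = greet("Manish")           # nothing has executed yet
print(type(coro).__name__)       # it's just a coroutine object

result = asyncio.run(coro)       # only now does the body run
print(result)
```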
The await keyword is where the magic happens. When Python hits await, it pauses that coroutine and goes to do other work. When the awaited thing is done, it comes back and continues.
The Event Loop
The event loop is the heart of asyncio. It keeps track of all running coroutines, figures out which ones are ready to continue, and switches between them.
Task A runs → hits await → loop switches to Task B
Task B runs → hits await → loop switches to Task C
Task C runs → hits await → loop resumes whichever task is ready
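We can watch this switching happen. The sketch below (the worker names and delays are made up for illustration) records events in a list; both workers start before either finishes, because each await hands control back to the loop. It uses asyncio.gather, which is introduced in the next section:

```python
import asyncio

order = []

async def worker(name, delay):
    order.append(f"{name} start")
    await asyncio.sleep(delay)   # pause: the loop switches to another task
    order.append(f"{name} done")

async def main():
    # Both workers are started; the loop interleaves them at each await
    await asyncio.gather(worker("A", 0.2), worker("B", 0.1))

asyncio.run(main())
print(order)  # ['A start', 'B start', 'B done', 'A done']
```

B finishes before A even though A started first — the loop ran B's remainder while A was still sleeping.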
Running Multiple Tasks with gather()
The real power of asyncio is running many things concurrently. asyncio.gather() schedules multiple coroutines to run concurrently, waits until all of them finish, and returns their results in the order you passed them in.
import asyncio

async def fetch(url):
    print(f"Fetching {url}...")
    await asyncio.sleep(2)  # simulating network delay
    return f"Data from {url}"

async def main():
    # These run concurrently, not one after another
    results = await asyncio.gather(
        fetch("api.com/users"),
        fetch("api.com/posts"),
        fetch("api.com/comments"),
    )
    print(results)  # all three results, took ~2s total

asyncio.run(main())
Without gather, three sequential fetches would take ~6 seconds. With it, they overlap and take ~2 seconds.
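The timing claim is easy to verify. This sketch times both versions side by side, using a 0.2 s simulated delay instead of 2 s so it runs quickly (the URLs are placeholders):

```python
import asyncio
import time

async def fetch(url):
    await asyncio.sleep(0.2)  # stand-in for network delay
    return f"Data from {url}"

async def sequential():
    # each await finishes before the next starts: delays add up
    return [await fetch("a"), await fetch("b"), await fetch("c")]

async def concurrent():
    # all three sleeps overlap: total time ≈ one delay
    return await asyncio.gather(fetch("a"), fetch("b"), fetch("c"))

start = time.perf_counter()
asyncio.run(sequential())
seq = time.perf_counter() - start

start = time.perf_counter()
asyncio.run(concurrent())
conc = time.perf_counter() - start

print(f"sequential: {seq:.2f}s, concurrent: {conc:.2f}s")
```

With three 0.2 s delays, the sequential run takes roughly 0.6 s and the concurrent one roughly 0.2 s — the same 3× ratio as the 6 s vs 2 s example above.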
Creating Tasks
asyncio.create_task() schedules a coroutine to run in the background. We can do other things while the task runs.
async def main():
    task = asyncio.create_task(fetch("api.com/data"))
    # do other stuff while the task runs in the background
    print("Doing other work...")
    await asyncio.sleep(1)
    # now get the result
    result = await task
    print(result)

asyncio.run(main())  # fetch() is the coroutine from the previous example
The difference: await fetch(...) runs the coroutine and waits for it on the spot. create_task(fetch(...)) schedules it to start at the next opportunity — we await the task later, when we actually need its result.
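This sketch makes the scheduling visible by recording events in a list (the names and the 0.1 s delay are illustrative). Note that the task does not start the instant create_task() is called — it starts the next time main yields control with an await:

```python
import asyncio

events = []

async def fetch(url):
    events.append("fetch started")
    await asyncio.sleep(0.1)
    return f"Data from {url}"

async def main():
    task = asyncio.create_task(fetch("api.com/data"))
    events.append("task created")
    await asyncio.sleep(0)        # yield once so the task can start
    events.append("other work")   # fetch is already running by now
    result = await task
    events.append(result)

asyncio.run(main())
print(events)
```

The event order comes out as: task created, fetch started, other work, then the result — the task ran in the background between our awaits.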
Real-World Use: aiohttp
asyncio.sleep() is great for learning, but in practice we use async-compatible libraries. aiohttp is the go-to for HTTP requests.
import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.json()

async def main():
    async with aiohttp.ClientSession() as session:
        data = await asyncio.gather(
            fetch(session, "https://api.example.com/users"),
            fetch(session, "https://api.example.com/posts"),
        )
        print(data)

asyncio.run(main())
asyncio vs threading
| Feature | asyncio | threading |
|---|---|---|
| Threads | Single thread | Multiple threads |
| Switching | Cooperative (at await) | Preemptive (OS decides) |
| Race conditions | Rare (explicit yield points) | Common (need locks) |
| Best for | Many I/O tasks (1000+ connections) | Fewer I/O tasks, simpler code |
| Learning curve | Steeper | Gentler |
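The "cooperative switching" row has a practical consequence worth demonstrating: asyncio only switches at await, so an ordinary blocking call like time.sleep() stalls the entire event loop. A minimal sketch of the difference (the 0.2 s delays are arbitrary):

```python
import asyncio
import time

async def cooperative():
    await asyncio.sleep(0.2)   # yields: other coroutines run during the pause

async def blocking():
    time.sleep(0.2)            # never yields: the whole event loop stalls

async def main():
    start = time.perf_counter()
    await asyncio.gather(cooperative(), cooperative())
    good = time.perf_counter() - start   # ~0.2s: the sleeps overlap

    start = time.perf_counter()
    await asyncio.gather(blocking(), blocking())
    bad = time.perf_counter() - start    # ~0.4s: the sleeps run back to back

    print(f"non-blocking: {good:.2f}s, blocking: {bad:.2f}s")
    return good, bad

good, bad = asyncio.run(main())
```

This is the classic asyncio pitfall: one blocking call anywhere freezes every coroutine, which is exactly why async-compatible libraries like aiohttp matter.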
In simple terms, asyncio is like a really efficient waiter at a restaurant. Instead of standing at one table waiting for the kitchen, the waiter takes orders from all the tables and brings food as it's ready — all single-handedly, no extra waiters needed.