Small and idiomatic utility library for coroutine-driven asynchronous generic programming in Python.

Built on top of asyncio, paco provides capabilities missing from the Python stdlib in order to write asynchronous cooperative multitasking in a nice-ish way. paco also aims to port some of the functools and itertools standard functions to the asynchronous world.

paco can be your utility belt to deal with asynchronous, I/O-bound, non-blocking concurrent code in a cleaner and more idiomatic way.
- Simple and idiomatic API, extending the Python stdlib with async coroutine capabilities.
- Built-in configurable control-flow concurrency support (throttle).
- No fancy abstractions: it just works with plain asynchronous coroutines.
- Useful iterables, decorators, functors and convenient helpers.
- Coroutine-based functional helpers: `compose`, `throttle`, `partial`, `timeout`, `times`, `until`, `race`...
- Asynchronous coroutine ports of Python built-in functions: `filter`, `map`, `dropwhile`, `filterfalse`, `reduce`...
- Supports asynchronous iterables and generators (PEP 525).
- Concurrent iterables and higher-order functions.
- Better `asyncio.gather()` and `asyncio.wait()` with optional concurrency control and ordered results.
- Works with both `async`/`await` and `yield from` coroutine syntax.
- Reliable coroutine timeout limit handler via context manager.
- Designed for intensive I/O-bound, concurrent, non-blocking tasks.
- Good interoperability with `asyncio` and Python stdlib functions.
- Composable pipelines of functors via `|` operator overloading.
- Small and dependency-free.
- Compatible with Python 3.4+.
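Several of these features boil down to standard asyncio patterns. For instance, a coroutine timeout limit similar in spirit to paco's can be emulated with plain `asyncio.wait_for`; the sketch below is illustrative only, uses no paco APIs, and assumes Python 3.7+ for `asyncio.run`:

```python
import asyncio

async def slow_op():
    # Simulate an I/O-bound task that takes too long
    await asyncio.sleep(5)
    return 'done'

async def main():
    try:
        # Cancel the coroutine if it exceeds the time limit
        return await asyncio.wait_for(slow_op(), timeout=0.1)
    except asyncio.TimeoutError:
        return 'timed out'

print(asyncio.run(main()))  # => timed out
```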
Using the pip package manager:

```
pip install --upgrade paco
```

Or install the latest sources from GitHub:

```
pip install -e git+git://github.com/h2non/paco.git#egg=paco
```
- paco.ConcurrentExecutor
- paco.apply
- paco.compose
- paco.concurrent
- paco.constant
- paco.curry
- paco.defer
- paco.dropwhile
- paco.each
- paco.every
- paco.filter
- paco.filterfalse
- paco.flat_map
- paco.gather
- paco.identity
- paco.interval
- paco.map
- paco.once
- paco.partial
- paco.race
- paco.reduce
- paco.repeat
- paco.run
- paco.series
- paco.some
- paco.throttle
- paco.thunk
- paco.timeout
- paco.TimeoutLimit
- paco.times
- paco.until
- paco.wait
- paco.whilst
- paco.wraps
Asynchronously and concurrently execute multiple HTTP requests.

```python
import paco
import aiohttp

async def fetch(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as res:
            return res

async def fetch_urls():
    urls = [
        'https://www.google.com',
        'https://www.yahoo.com',
        'https://www.bing.com',
        'https://www.baidu.com',
        'https://duckduckgo.com',
    ]

    # Map concurrent executor with a concurrency limit of 3
    responses = await paco.map(fetch, urls, limit=3)

    for res in responses:
        print('Status:', res.status)

# Run in event loop
paco.run(fetch_urls())
```
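The `limit` option above caps how many `fetch` coroutines run at once. For readers curious what such a limit does under the hood, here is a minimal stdlib-only sketch using `asyncio.Semaphore` (the `map_limit` and `double` names are illustrative, not part of paco's API):

```python
import asyncio

async def map_limit(coro, iterable, limit=3):
    # Allow at most `limit` coroutines to run at any given time
    sem = asyncio.Semaphore(limit)

    async def bounded(item):
        async with sem:
            return await coro(item)

    # gather() preserves input order in its results
    return await asyncio.gather(*(bounded(x) for x in iterable))

async def double(x):
    await asyncio.sleep(0)  # yield control, simulating I/O
    return x * 2

print(asyncio.run(map_limit(double, [1, 2, 3, 4, 5])))  # => [2, 4, 6, 8, 10]
```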
Concurrent pipeline-style composition of transform functors over an iterable object.

```python
import paco

async def filterer(x):
    return x < 8

async def mapper(x):
    return x * 2

async def drop(x):
    return x < 10

async def reducer(acc, x):
    return acc + x

async def task(numbers):
    return await (numbers
                  | paco.filter(filterer)
                  | paco.map(mapper)
                  | paco.dropwhile(drop)
                  | paco.reduce(reducer, initializer=0))

# Run in event loop
number = paco.run(task((1, 2, 3, 4, 5, 6, 7, 8, 9, 10)))
print('Number:', number)  # => 36
```
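The pipeline syntax relies on `|` operator overloading. As a rough illustration of how such a pipeline can be wired up (synchronously, for simplicity), a functor can implement `__ror__`; the `Step` class below is a hypothetical sketch, not paco's actual implementation:

```python
from functools import reduce
from itertools import dropwhile

class Step:
    """Wrap a function so it can be applied with the `|` operator."""
    def __init__(self, fn):
        self.fn = fn

    def __ror__(self, data):
        # `data | Step(fn)` evaluates to fn(data)
        return self.fn(data)

result = ((1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
          | Step(lambda xs: [x for x in xs if x < 8])                # filter
          | Step(lambda xs: [x * 2 for x in xs])                     # map
          | Step(lambda xs: list(dropwhile(lambda x: x < 10, xs)))   # dropwhile
          | Step(lambda xs: reduce(lambda acc, x: acc + x, xs, 0)))  # reduce

print('Number:', result)  # => Number: 36
```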
MIT - Tomas Aparicio