aiomultiprocess

Take a modern Python codebase to the next level of performance.


On their own, AsyncIO and multiprocessing are useful but limited: AsyncIO is still bound to a single core by the GIL, and multiprocessing runs only one task per process at a time. Together, they can realize their full potential.

aiomultiprocess presents a simple interface, while running a full AsyncIO event loop on each child process, enabling levels of concurrency never before seen in a Python application. Each child process can execute multiple coroutines at once, limited only by the workload and number of cores available.

Gathering tens of thousands of network requests in seconds is as easy as:

async with Pool() as pool:
    results = await pool.map(<coroutine>, <items>)

For more context, watch the PyCon US 2018 talk about aiomultiprocess, "Thinking Outside the GIL".

Slides available at Speaker Deck.

Install

aiomultiprocess requires Python 3.6 or newer. You can install it from PyPI:

$ pip3 install aiomultiprocess

Usage

aiomultiprocess mirrors the standard multiprocessing module wherever possible, while adapting the parts of the interface that benefit from async functionality.

Executing a coroutine on a child process is as simple as:

import asyncio
from aiohttp import request
from aiomultiprocess import Process

async def put(url, params):
    # Issue a PUT request; the response is discarded
    async with request("PUT", url, params=params) as response:
        pass

async def main():
    # Run put() on a child process and wait for it to complete
    p = Process(target=put, args=("https://jreese.sh", {}))
    await p

if __name__ == "__main__":
    asyncio.run(main())
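
If you need finer control over the child's lifecycle, Process also exposes start() and an awaitable join(). The sketch below follows the names used in the aiomultiprocess user guide; check the documentation for the exact signatures:

import asyncio
from aiomultiprocess import Process

async def sleepy():
    # Stand-in workload that runs on the child's event loop
    await asyncio.sleep(1)

async def main():
    p = Process(target=sleepy)
    p.start()       # spawn the child process
    await p.join()  # wait for the child to finish

if __name__ == "__main__":
    asyncio.run(main())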

If you want to get results back from that coroutine, Worker makes that available:

import asyncio
from aiohttp import request
from aiomultiprocess import Worker

async def get(url):
    async with request("GET", url) as response:
        return await response.text("utf-8")

async def main():
    # Worker behaves like Process, but awaiting it returns the coroutine's result
    p = Worker(target=get, args=("https://jreese.sh", ))
    response = await p

if __name__ == "__main__":
    asyncio.run(main())

If you want a managed pool of worker processes, then use Pool:

import asyncio
from aiohttp import request
from aiomultiprocess import Pool

async def get(url):
    async with request("GET", url) as response:
        return await response.text("utf-8")

async def main():
    urls = ["https://jreese.sh", ...]  # fill in the full list of URLs to fetch
    async with Pool() as pool:
        results = await pool.map(get, urls)

if __name__ == "__main__":
    asyncio.run(main())
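
Pool also accepts tuning knobs for how many worker processes to spawn and how many coroutines each child runs at once. A minimal sketch, assuming the processes and childconcurrency keyword arguments described in the aiomultiprocess documentation (the URL list here is a hypothetical workload):

import asyncio
from aiohttp import request
from aiomultiprocess import Pool

async def get(url):
    async with request("GET", url) as response:
        return await response.text("utf-8")

async def main():
    urls = ["https://jreese.sh"] * 100  # hypothetical workload
    # processes: number of child processes to spawn
    # childconcurrency: how many coroutines each child may run concurrently
    async with Pool(processes=4, childconcurrency=8) as pool:
        results = await pool.map(get, urls)

if __name__ == "__main__":
    asyncio.run(main())

Raising childconcurrency helps most with I/O-bound tasks, while the number of processes is what lets CPU-bound work scale past the GIL.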

License

aiomultiprocess is copyright John Reese, and licensed under the MIT license. I am providing code in this repository to you under an open source license. This is my personal repository; the license you receive to my code is from me and not from my employer. See the LICENSE file for details.
