Cheat Sheet for Pool

This document summarizes multiprocessing pools in Python: how to create and configure a pool, issue tasks synchronously and asynchronously, handle results, and use callbacks. Pools execute CPU-bound tasks concurrently in child processes to improve performance.

SuperFastPython.com Cheat Sheet for multiprocessing.Pool

Why multiprocessing.Pool?
Execute ad hoc functions that perform CPU-bound tasks asynchronously in new child processes, such as compute tasks or mathematical operations.

Create, Configure, Use

Import
from multiprocessing import Pool

Create, default config
pool = Pool()

Config number of workers
pool = Pool(processes=8)

Config worker initializer function
pool = Pool(initializer=init, initargs=(a1, a2))

Config max tasks per child worker
pool = Pool(maxtasksperchild=10)

Config multiprocessing context
from multiprocessing import get_context
ctx = get_context('spawn')
pool = ctx.Pool()

Close after tasks finish, prevent further tasks
pool.close()

Terminate, kill running tasks
pool.terminate()

Join, after close, wait for workers to stop
pool.join()

Context manager, terminate automatically
with Pool() as pool:
    # ...

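A minimal runnable sketch tying pool creation, configuration, and cleanup together; the initializer init_worker() and the task square() are hypothetical placeholders. The __main__ guard is required on platforms that use the spawn start method.

from multiprocessing import Pool

def init_worker():
    # hypothetical per-worker setup (e.g. seed state, open a resource)
    pass

def square(x):
    # hypothetical CPU-bound task
    return x * x

if __name__ == '__main__':
    # the context manager closes and terminates the pool on exit
    with Pool(processes=4, initializer=init_worker) as pool:
        results = pool.map(square, range(10))
    print(results)
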
Issue Tasks Synchronously
Issue tasks, block until complete.

Issue one task
value = pool.apply(task, (a1, a2))

Issue many tasks
for val in pool.map(task, items):
    # ...

Issue many tasks, lazy
for val in pool.imap(task, items):
    # ...

Issue many tasks, lazy, unordered results
for val in pool.imap_unordered(task, items):
    # ...

Issue many tasks, multiple arguments
items = [(1, 2), (3, 4), (5, 6)]
for val in pool.starmap(task, items):
    # ...

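A short sketch of the blocking calls above; the tasks add(a, b) and square(x) are hypothetical stand-ins.

from multiprocessing import Pool

def add(a, b):
    # hypothetical CPU-bound task taking two arguments
    return a + b

def square(x):
    # hypothetical single-argument task
    return x * x

if __name__ == '__main__':
    with Pool() as pool:
        # apply() blocks until the single task returns
        print(pool.apply(add, (10, 20)))
        # starmap() blocks until all tasks return, results in input order
        print(pool.starmap(add, [(1, 2), (3, 4), (5, 6)]))
        # imap_unordered() yields results lazily as they complete
        for val in pool.imap_unordered(square, range(5)):
            print(val)
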
Issue Tasks Asynchronously
Issue tasks, return an AsyncResult immediately.

Issue one task
ar = pool.apply_async(task, (a1, a2))

Issue many tasks
ar = pool.map_async(task, items)

Issue many tasks, multiple arguments
items = [(1, 2), (3, 4), (5, 6)]
ar = pool.starmap_async(task, items)

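A sketch of issuing work without blocking, again with a hypothetical task square(x); the main process is free to do other work before collecting results.

from multiprocessing import Pool

def square(x):
    # hypothetical CPU-bound task
    return x * x

if __name__ == '__main__':
    with Pool() as pool:
        # returns an AsyncResult immediately; work runs in the background
        ar = pool.map_async(square, range(10))
        # ... do other work in the main process here ...
        print(ar.get())  # block until all results are ready
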
Use AsyncResult (handles on async tasks)
Via apply_async(), map_async(), starmap_async()

Get result (blocking)
value = ar.get()

Get result with exception
try:
    value = ar.get()
except Exception as e:
    # ...

Get result with timeout
value = ar.get(timeout=5)

Wait for task to complete (blocking)
ar.wait()

Wait for task, with timeout
ar.wait(timeout=5)

Check if task is finished (not running)
if ar.ready():
    # ...

Check if task was successful (no exception)
if ar.successful():
    # ...
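A sketch that waits on an AsyncResult and inspects it; fail_on_odd() is a hypothetical task that may raise. Note that successful() should only be called once ready() is true.

from multiprocessing import Pool

def fail_on_odd(x):
    # hypothetical task that raises for odd inputs
    if x % 2:
        raise ValueError('odd input: %d' % x)
    return x

if __name__ == '__main__':
    with Pool() as pool:
        ar = pool.apply_async(fail_on_odd, (3,))
        ar.wait(timeout=10)                 # wait up to 10 seconds
        if ar.ready() and not ar.successful():
            print('task raised an exception')
        try:
            value = ar.get(timeout=1)       # re-raises the task's exception
        except ValueError as e:
            print('caught:', e)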

Async Callbacks
Via apply_async(), map_async(), starmap_async()

Add result callback, takes result as arg
ar = pool.apply_async(task, callback=handler)

Add error callback, takes error as arg
ar = pool.apply_async(task, error_callback=handler)

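A sketch wiring up both callbacks; work(), on_result(), and on_error() are hypothetical names. Callbacks run in the parent process as each task finishes.

from multiprocessing import Pool

def work(x):
    # hypothetical task that raises for negative inputs
    if x < 0:
        raise ValueError('negative input')
    return x * 2

def on_result(result):
    # called with the return value when a task succeeds
    print('result:', result)

def on_error(error):
    # called with the exception when a task fails
    print('error:', error)

if __name__ == '__main__':
    with Pool() as pool:
        pool.apply_async(work, (5,), callback=on_result, error_callback=on_error)
        pool.apply_async(work, (-1,), callback=on_result, error_callback=on_error)
        pool.close()
        pool.join()  # wait so both callbacks have a chance to fire
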
Chunksize
Via all versions of map() functions.

Issue multiple tasks to each worker
for val in pool.map(task, items, chunksize=5):
    # ...

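A sketch of batching tasks to workers; square() is a hypothetical cheap task and chunksize=5 is an arbitrary choice. Larger chunks mean fewer, bigger batches sent to each worker and less inter-process overhead.

from multiprocessing import Pool

def square(x):
    # hypothetical cheap task; batching reduces per-task messaging overhead
    return x * x

if __name__ == '__main__':
    items = list(range(100))
    with Pool(processes=4) as pool:
        # each worker receives tasks in batches of 5 rather than one at a time
        results = pool.map(square, items, chunksize=5)
    print(results[:10])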
