Cheat Sheet for ProcessPoolExecutor

The document provides an overview of using ProcessPoolExecutor in Python to execute asynchronous CPU-bound tasks in child processes. It describes how to create and configure a ProcessPoolExecutor, submit tasks to the executor, retrieve results using Futures, and wait on tasks. Key points covered include submitting single tasks or batches of tasks, getting results synchronously or asynchronously, and shutting down the executor.


SuperFastPython.com Cheat Sheet for ProcessPoolExecutor


Why ProcessPoolExecutor?
Execute ad hoc functions that perform CPU-bound tasks asynchronously in new child processes, such as compute tasks or mathematical operations.

Create, Configure, Use

Imports
    from concurrent.futures import *
    import multiprocessing

Create, default config
    pool = ProcessPoolExecutor()

Config number of workers
    pool = ProcessPoolExecutor(max_workers=10)

Config multiprocessing context
    ctx = multiprocessing.get_context('spawn')
    pool = ProcessPoolExecutor(mp_context=ctx)

Config worker initializer function
    pool = ProcessPoolExecutor(initializer=init, initargs=(a1, a2))

Shutdown and wait, do not cancel pending tasks
    pool.shutdown()

Shutdown without waiting, do not cancel pending tasks
    pool.shutdown(wait=False)

Shutdown and wait, cancel pending tasks
    pool.shutdown(cancel_futures=True)

Shutdown without waiting, cancel pending tasks
    pool.shutdown(wait=False, cancel_futures=True)

Context manager, shuts down automatically
    with ProcessPoolExecutor() as pool:
        # ...

Issue Async Tasks to the Pool

Issue one task asynchronously
    future = pool.submit(task)

Issue one task with arguments
    future = pool.submit(task, a1, a2)

Issue many tasks, collect Futures
    futures = [pool.submit(task) for _ in range(5)]

Issue many tasks, iterate results in order
    for res in pool.map(task, range(10)):
        # ...

Issue many tasks, iterate results with a timeout in seconds
    try:
        for res in pool.map(task, range(10), timeout=0.5):
            # ...
    except TimeoutError:
        # ...

Issue many tasks in chunks, iterate results
    for res in pool.map(task, range(10), chunksize=10):
        # ...

Wait for all tasks via Futures
    wait(futures)

Wait with a timeout in seconds
    wait(futures, timeout=0.5)

Wait for the first task to complete
    wait(futures, return_when=FIRST_COMPLETED)

Wait for the first task failure
    wait(futures, return_when=FIRST_EXCEPTION)

Iterate Futures in order of completion
    for fut in as_completed(futures):
        # ...

Use Futures (handles on async tasks)

Get result (blocking)
    result = future.result()

Get result, handling an exception raised by the task
    try:
        result = future.result()
    except Exception as e:
        # ...

Get result with a timeout in seconds
    try:
        result = future.result(timeout=0.5)
    except TimeoutError:
        # ...

Get the exception raised by a task, if any
    exception = future.exception()

Get exception with a timeout in seconds
    try:
        exception = future.exception(timeout=0.5)
    except TimeoutError:
        # ...

Cancel a task that has not yet started (running tasks cannot be cancelled)
    cancelled = future.cancel()

Check if task is running (started, not done)
    if future.running():
        # ...

Check if task is done (completed or cancelled)
    if future.done():
        # ...

Check if task was cancelled
    if future.cancelled():
        # ...

Add a done callback function
    future.add_done_callback(myfunc)
