Asynchronous Tasks With FastAPI and Celery
Jan 28, 2024 • 5 min read
Tags: Python, FastAPI, Celery
Overview
When you have a long-running Python function that you want to expose via an API
endpoint, you might want to run the function asynchronously to avoid timeouts.
Running a function asynchronously means it won't block the rest of your code. Instead
of waiting for the function to finish, your program can continue executing other
tasks, making it more responsive.
In this post, I will show you how to use Celery to execute tasks asynchronously in your
FastAPI application. We will use the function below to simulate long-running code:
import math
import time

def square_root(num: float):
    time.sleep(10)  # simulate 10 seconds of work
    return math.sqrt(num)
In real-world applications, this might be sending emails to users, processing long
videos, or training ML models.
Celery requires a message broker to transmit messages between the client (where
tasks are initiated) and the workers (where tasks are executed). There are many
message broker options for Celery. In this post, we will use Redis.
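If you don't already have Redis running locally, one quick way to start it (assuming Docker is installed) is:

```shell
docker run --rm -d -p 6379:6379 redis
```

This exposes Redis on the default port 6379, which is what our Celery configuration below expects.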
nashruddinamin.com
Web Development and AI - Python, React, Rust. 1
Our FastAPI application will expose two endpoints:
● /square_root to initiate the execution of the function.
● /status/<task_id> to get the status of the task and retrieve the result.
Create a new file named main.py and write dummy API endpoints with FastAPI:

# main.py
from fastapi import FastAPI

app = FastAPI()

@app.post('/square_root')
def process(num: float):
    return {'taskId': None}

@app.get('/status/{task_id}')
def status(task_id: str):
    return {'status': 'IN_PROGRESS'}
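Run the application with Uvicorn (assuming it is installed; it is the ASGI server commonly used with FastAPI):

```shell
uvicorn main:app --reload --port 8000
```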
The server runs on localhost, listening on port 8000. You can test the two
endpoints we created earlier with curl or HTTPie and see that they return the
dummy responses.
Next, create a file named tasks.py to define the Celery task:

# tasks.py
import math
import os
import time

from celery import Celery

REDIS_HOST = os.getenv('REDIS_HOST', 'localhost')
REDIS_PORT = os.getenv('REDIS_PORT', '6379')

celery = Celery(
    'tasks',
    broker=f'redis://{REDIS_HOST}:{REDIS_PORT}/0',
    backend=f'redis://{REDIS_HOST}:{REDIS_PORT}/0'
)

@celery.task
def square_root(num: float):
    time.sleep(10)  # simulate a long-running task
    return math.sqrt(num)
In this file, we configure Celery with Redis as both the message broker and the
result backend. The @celery.task decorator transforms the square_root function
into a Celery task, allowing it to be scheduled and executed asynchronously.
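Start a worker process from the same directory, so that the tasks module is importable:

```shell
celery -A tasks worker --loglevel=info
```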
This command starts a Celery worker that processes tasks defined in the tasks
module. Setting the logging level to "info" provides more detailed output about
the tasks being processed.
Finally, update main.py to dispatch tasks to the worker:
# main.py
from tasks import celery, square_root
...

@app.post('/square_root')
def process(num: float):
    task = square_root.delay(num)
    return {'taskId': task.id}

@app.get('/status/{task_id}')
def status(task_id: str):
    task = celery.AsyncResult(task_id)
    if task.ready():
        return {'status': 'DONE', 'result': task.get()}
    else:
        return {'status': 'IN_PROGRESS'}
In the /square_root endpoint, the code executes the square_root function
asynchronously and returns the ID of the task. The client then needs to check the
status of the task using that ID.
In the /status/<task_id> endpoint, the code accepts a task ID and returns its
status. If the task result is already available, the endpoint also returns it.
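End to end, the flow looks like this with curl (your task ID will differ; the second call returns the result once the 10-second sleep has elapsed):

```shell
curl -X POST 'http://localhost:8000/square_root?num=144'
# {"taskId":"<task-id>"}

curl 'http://localhost:8000/status/<task-id>'
# {"status":"DONE","result":12.0}
```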
Next steps
After you build your FastAPI and Celery application, you might want to dockerize the
code and use Docker Compose to simplify the deployment.
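A minimal docker-compose.yml sketch of that setup might look like the following; the service names and image tag are assumptions, and both the api and worker services build from the same project directory:

```yaml
services:
  redis:
    image: redis:7
  api:
    build: .
    command: uvicorn main:app --host 0.0.0.0 --port 8000
    ports:
      - "8000:8000"
    environment:
      - REDIS_HOST=redis
    depends_on:
      - redis
  worker:
    build: .
    command: celery -A tasks worker --loglevel=info
    environment:
      - REDIS_HOST=redis
    depends_on:
      - redis
```

Because tasks.py reads REDIS_HOST from the environment, the same code connects to the redis service inside Compose and to localhost during development.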