Dockerize Your FastAPI and Celery Application
May 9, 2024 • 5 min read
Tags: FastAPI, Celery, Docker, Compose
A while ago I wrote a tutorial about how to use Celery with FastAPI to run
asynchronous tasks. In this post, I will explain how to dockerize that application and
simplify its deployment with Docker Compose. These are the package versions used here, pinned in requirements.txt:
fastapi==0.111.0
celery==5.4.0
redis==5.0.4
# main.py
from fastapi import FastAPI

from tasks import celery, square_root

app = FastAPI()


@app.post('/square_root')
def process(num: float):
    # Send the task to the Celery worker and return its ID right away
    task = square_root.delay(num)
    return {'taskId': task.id}


@app.get('/status/{task_id}')
def status(task_id: str):
    # Look up the task by ID in the result backend
    task = celery.AsyncResult(task_id)
    if task.ready():
        return {'status': 'DONE', 'result': task.get()}
    else:
        return {'status': 'IN_PROGRESS'}
# tasks.py
import math
import os
import time

from celery import Celery

# Read the Redis location from the environment
# (set by docker-compose.yml), with defaults for local runs.
REDIS_HOST = os.getenv('REDIS_HOST', 'localhost')
REDIS_PORT = os.getenv('REDIS_PORT', '6379')

celery = Celery(
    'tasks',
    broker=f'redis://{REDIS_HOST}:{REDIS_PORT}/0',
    backend=f'redis://{REDIS_HOST}:{REDIS_PORT}/0'
)


@celery.task
def square_root(num: float):
    # Sleep to simulate a long-running computation
    time.sleep(10)
    return math.sqrt(num)
When you make a POST request to /square_root, it returns a task ID immediately and runs
the calculation as a Celery background task. The task simply calls time.sleep to
simulate long-running code before computing the square root, and you can poll
/status/{task_id} to find out when the result is ready.
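To see this flow end to end, here is a minimal client sketch. It assumes the API is reachable at http://localhost:8000 and uses httpx (FastAPI 0.111 pulls it in as a dependency; otherwise install it with pip). The file name client.py is just a suggestion:

# client.py (example only, not part of the application)
import time

import httpx

BASE_URL = 'http://localhost:8000'

# Submit a number; num is passed as a query parameter
response = httpx.post(f'{BASE_URL}/square_root', params={'num': 16})
task_id = response.json()['taskId']
print('task id:', task_id)

# Poll the status endpoint until the worker has finished
while True:
    data = httpx.get(f'{BASE_URL}/status/{task_id}').json()
    if data['status'] == 'DONE':
        print('result:', data['result'])
        break
    time.sleep(1)

After roughly ten seconds you should see the result, 4.0 in this case.

Next, create a file named Dockerfile in the project root with the following contents: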
FROM python:3.12-slim
WORKDIR /app
COPY . .
RUN pip install --no-cache-dir -r requirements.txt
Let's go through the Dockerfile line by line:
● FROM python:3.12-slim
This line specifies the base image for the Docker image. Here it uses python:3.12-slim, a
slimmed-down variant of the official Python 3.12 image with fewer preinstalled packages.
● WORKDIR /app
This line sets the working directory inside the container to /app. It means that
all subsequent commands will be executed in this directory.
● COPY . .
This line copies the entire contents of the current directory (where the
Dockerfile resides) into the /app directory inside the container, including your
application code and any other necessary files.
● RUN pip install --no-cache-dir -r requirements.txt
This line installs the Python dependencies listed in requirements.txt. The
--no-cache-dir flag tells pip not to keep its download cache, which keeps the image smaller.
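This Dockerfile is shared by both the API and the worker; only the command each container runs will differ. If you want to check that the image builds before wiring up Compose, you can build it manually (the tag name here is just an example):

docker build -t fastapi-celery .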
Create a new file named docker-compose.yml with the following contents:
services:
  api:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - 8000:8000
    depends_on:
      - worker
    environment:
      - REDIS_HOST=redis
      - REDIS_PORT=6379
    command: uvicorn main:app --host 0.0.0.0 --port 8000

  worker:
    build:
      context: .
      dockerfile: Dockerfile
    depends_on:
      - redis
    environment:
      - REDIS_HOST=redis
      - REDIS_PORT=6379
    command: celery -A tasks worker --loglevel=info

  redis:
    image: redis:latest
The api service is responsible for running the FastAPI application. It depends on the
worker service, so Docker Compose starts the worker container before the API (note that
depends_on only controls start order, not whether a service is actually ready). The
environment variables REDIS_HOST and REDIS_PORT are set so the application can connect
to the Redis service.
The worker service is responsible for running the Celery worker. It depends on the
redis service and sets the REDIS_HOST and REDIS_PORT environment variables to
connect to Redis.
The redis service uses the redis:latest image to run a Redis server. It acts as both
the message broker and the result backend for Celery, which is how the API and the
worker communicate.
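Keep in mind that depends_on only controls start order; it does not wait for Redis to actually accept connections. Celery normally retries the broker connection on startup, but if you want the worker to wait explicitly, an optional refinement (a sketch, not part of the file above) is to give the redis service a healthcheck and make the worker depend on its health:

  worker:
    depends_on:
      redis:
        condition: service_healthy

  redis:
    image: redis:latest
    healthcheck:
      # redis-cli ping returns PONG once the server accepts connections
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 3s
      retries: 5

These snippets go under the services: key, replacing the corresponding parts of the file above.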
Finally, start the whole stack with:

docker compose up
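Docker Compose builds the image and starts all three containers, and you will see the logs from each service in one stream. You can also run everything in the background and follow only the worker:

docker compose up -d --build
docker compose logs -f worker

Once the containers are up, the API is available at http://localhost:8000 and you can exercise it with the client sketch shown earlier.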