Celery for Asynchronous Tasks
A distributed task queue for Python applications to run tasks asynchronously.
Introduction
Celery is a popular open-source distributed task queue written in Python that lets you run tasks asynchronously. It is designed for work that is time-consuming, CPU-intensive, or I/O-bound, and it supports several message brokers, including RabbitMQ, Redis, and Amazon SQS.
Why Use Celery?
Celery offers several benefits for asynchronous task processing:
- Improved responsiveness: By running tasks in the background, your application remains responsive to user requests.
- Scalability: Celery allows you to distribute tasks across multiple workers, making it ideal for large-scale applications.
- Reliability: Tasks are stored in a message broker and can be retried if they fail.
Basic Components
A Celery application consists of three main components:
| Component | Description |
|---|---|
| Broker | Stores task messages between the application and workers. Examples include RabbitMQ and Redis. |
| Worker | Executes tasks from the queue. Workers can run on multiple machines for load balancing and high availability. |
| Client | Used by your application to send tasks to the broker. |
Configuring Celery
Celery is typically configured using a celeryconfig.py file, which defines settings such as:
- The message broker URL
- The worker concurrency (number of tasks executed simultaneously)
- The task queue name
# celeryconfig.py
broker_url = 'amqp://guest:guest@localhost:5672//'  # RabbitMQ broker
result_backend = 'db+sqlite:///results.sqlite'      # SQLite result backend (via SQLAlchemy)
worker_concurrency = 4                              # number of tasks executed simultaneously
task_default_queue = 'default'                      # task queue name
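With these settings saved, the application loads them at startup via Celery's `config_from_object`. A minimal wiring sketch (the app name `my_app` is illustrative):

```python
# app.py - minimal wiring sketch; the app name 'my_app' is illustrative
from celery import Celery

app = Celery('my_app')
app.config_from_object('celeryconfig')  # reads broker_url, result_backend, etc.
```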
Creating Tasks
Tasks are Python functions decorated with the @app.task decorator:
# tasks.py
from celery import Celery

app = Celery('tasks', broker='amqp://guest:guest@localhost:5672//')

@app.task
def add(x, y):
    return x + y
Sending and Executing Tasks
Tasks are sent to the broker using the client API:
# client.py
from tasks import add  # import the task function

result = add.delay(4, 4)       # send a task to the queue
print(result.get(timeout=10))  # block until the result arrives (requires a result backend)

Avoid calling result.get() from inside another task: waiting on a subtask can deadlock the worker.
Monitoring and Debugging
Celery provides several tools for monitoring and debugging:
- Flower (often called Celery Flower): a web-based interface for monitoring tasks, workers, and queues.
- Celery Beat: a scheduler for running periodic tasks on an interval or crontab schedule.
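In practice, workers and these tools are started from the command line. A sketch, assuming the Celery app lives in a module named my_app and that Flower was installed separately (pip install flower):

```shell
# start a worker that consumes tasks from the queue
celery -A my_app worker --loglevel=info

# start the Flower monitoring dashboard (http://localhost:5555 by default)
celery -A my_app flower

# start the Beat scheduler for periodic tasks
celery -A my_app beat
```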
Example Use Case: Image Processing
Celery can be used to process large images in the background, improving application responsiveness and reducing load on the server:
# image_tasks.py
import os

from celery import shared_task
from PIL import Image  # Pillow library for image processing

@shared_task
def resize_image(image_path):
    img = Image.open(image_path)
    img.thumbnail((800, 800))  # shrink in place to fit within an 800x800 box, preserving aspect ratio
    root, ext = os.path.splitext(image_path)
    img.save(root + '_resized' + ext)  # save next to the original so results are not overwritten
# my_app/tasks.py
from celery import shared_task

from .image_tasks import resize_image

@shared_task
def process_images():
    images = get_images_from_database()  # placeholder: returns a list of image paths
    for image_path in images:
        resize_image.delay(image_path)  # queue one resize task per image
Conclusion
Celery provides a powerful and flexible solution for asynchronous task processing, allowing you to improve responsiveness, scalability, and reliability. With its simple configuration and rich feature set, Celery is an ideal choice for building scalable web applications.