Getting Started
Celery is a powerful, flexible, and reliable distributed task queue that allows you to process vast amounts of messages while providing operations with the tools to maintain such a system. It focuses on real-time processing while also supporting task scheduling.
Prerequisites
- Python: Version 3.10 or newer.
- Message Broker: Celery requires a message broker to send and receive messages.
- RabbitMQ: (Recommended) Feature-complete and stable.
- Redis: Also feature-complete, but more susceptible to data loss in the event of abrupt termination.
- System Dependencies: (Optional, for some extensions)
build-essential, libssl-dev, libffi-dev (for building dependencies from source).
Installation
Install Celery using pip:
pip install -U celery
If you plan to use Redis as the broker or result backend, install the redis bundle:
pip install "celery[redis]"
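Several extras can be combined in a single install. A hedged sketch, assuming the `redis` and `msgpack` extras from Celery's bundle list:

```shell
# Install Celery together with the Redis transport bundle and the
# msgpack serializer bundle in one step
pip install "celery[redis,msgpack]"
```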
Hello World / Quick Start
To create a minimal Celery application, follow these steps:
1. Create your application
Create a file named tasks.py:
from celery import Celery
# Initialize Celery with a name and a broker URL
# For RabbitMQ: 'pyamqp://guest@localhost//'
# For Redis: 'redis://localhost:6379/0'
app = Celery('tasks', broker='pyamqp://guest@localhost//')
@app.task
def add(x, y):
return x + y
2. Run the Celery worker server
Open a terminal and start the worker process:
celery -A tasks worker --loglevel=info
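The worker accepts a number of options beyond the log level; a sketch with two commonly used ones (the node name `worker1` is just an illustrative choice):

```shell
# --concurrency  number of worker processes (defaults to the CPU count)
# --hostname     custom node name; %h expands to the machine's hostname
celery -A tasks worker --loglevel=info --concurrency=4 --hostname=worker1@%h
```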
3. Call the task
In a separate Python shell or script, you can call the task asynchronously using the delay() method:
from tasks import add
# This sends the task to the broker and returns an AsyncResult instance
result = add.delay(4, 4)
# You can check if the task is ready (requires a result backend configured)
print(f"Task ready: {result.ready()}")
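With a result backend configured (see the Configuration section below), you can also wait for and retrieve the task's return value. A usage sketch, assuming a worker is running and a backend is set up; otherwise `get()` will block until the timeout:

```python
from tasks import add

# Send the task to the broker; returns an AsyncResult immediately
result = add.delay(4, 4)

# Block until the task finishes; the timeout makes a missing worker
# raise an exception instead of hanging forever
print(result.get(timeout=10))  # the task's return value, here 8
print(result.state)            # e.g. 'SUCCESS'
```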
Configuration
While Celery works with minimal configuration, you can customize it using app.conf or by loading a configuration module.
Using a Result Backend
If you want to keep track of task states and return values, you must configure a result backend:
app = Celery('tasks',
broker='pyamqp://',
backend='redis://localhost')
Loading from a module
For larger projects, it is recommended to use a dedicated configuration module:
app = Celery('proj')
app.config_from_object('celeryconfig')
Example celeryconfig.py:
broker_url = 'pyamqp://'
result_backend = 'rpc://'
task_serializer = 'json'
result_serializer = 'json'
accept_content = ['json']
timezone = 'Europe/Oslo'
enable_utc = True
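The same settings can also be applied directly on the application instance with `app.conf.update()`, without a separate module; a minimal sketch:

```python
from celery import Celery

app = Celery('proj', broker='pyamqp://')

# Equivalent to the celeryconfig.py module above,
# applied directly on the application instance
app.conf.update(
    result_backend='rpc://',
    task_serializer='json',
    result_serializer='json',
    accept_content=['json'],
    timezone='Europe/Oslo',
    enable_utc=True,
)
```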
Verify Installation
You can verify that Celery is installed correctly and check your configuration using the report command:
celery report
To check the version:
celery --version
Other Install Options
Docker
You can run Celery within a Docker container. A docker-compose.yml is provided in the repository for development:
docker-compose up
From Source
To install the latest development version:
pip install https://github.com/celery/celery/zipball/main#egg=celery
Next Steps
- Periodic Tasks: Use Introduction to Celery Beat to schedule tasks to run at specific intervals.
- Canvas: Create complex workflows using Introduction to Task Signatures, Groups: Parallel Task Execution, and Chains: Sequential Task Execution.
- Monitoring: Use tools like Flower for real-time monitoring of your workers and tasks.
- Production Deployment: Read the Platform Integration: Daemons, PIDs, and Signals guide for running Celery as a system service.
Troubleshooting
- Worker not starting: Ensure your broker (RabbitMQ/Redis) is running and accessible at the configured URL.
- Tasks not executing: Check the worker logs for connection errors. Ensure the worker is initialized with the correct app module using -A <module_name>.
- Results always pending: Ensure you have configured a result_backend and that the worker has access to it.
- Windows Support: Celery does not officially support Windows, though it may work for development. For production, Linux or macOS is recommended.
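When debugging the issues above, Celery's inspection commands can confirm that workers are reachable. Both commands below require a running worker and a reachable broker:

```shell
# Ping all workers; each live worker node replies with "pong"
celery -A tasks inspect ping

# Show which worker nodes are currently online
celery -A tasks status
```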