Getting Started with the Celery App

In this tutorial, you will build a functional Celery application that can process background tasks. You will learn how to initialize the core application instance, configure a message broker and result backend, and define your first asynchronous task.

Prerequisites

To follow this tutorial, you need a message broker installed and running. Celery supports several brokers, but RabbitMQ is the default recommended choice.

  • RabbitMQ: Ensure it is running on localhost with the default guest credentials.
  • Python Environment: You should have the celery package installed in your environment.

Step 1: Initialize the Celery Application

The first step is to create an instance of the Celery class, imported from the top-level celery package. This instance acts as the entry point for everything you do in Celery, such as creating tasks and managing workers.

Create a file named tasks.py and add the following code:

from celery import Celery

app = Celery(
    'tasks',
    broker='pyamqp://guest@localhost//',
    backend='rpc://'
)

What this does

  • The first argument ('tasks') is the name of the main module. This is used to generate unique names for your tasks when they are registered.
  • The broker keyword argument specifies the URL of the message broker. Here, we use pyamqp://, which is the protocol for RabbitMQ.
  • The backend keyword argument specifies the result store. Using rpc:// sends results back as AMQP messages, which is useful for simple setups.

Step 2: Define Your First Task

Tasks are the building blocks of Celery. You create them by decorating a standard Python function with the @app.task decorator provided by your application instance.

Add this task to your tasks.py file:

@app.task
def add(x, y):
    return x + y

What this does

The @app.task decorator tells Celery that this function should be registered in the application's task registry (app.tasks). When you call this function using .delay(), Celery doesn't execute it locally; instead, it sends a message to the broker for a worker to pick up.

Step 3: Configure Advanced Settings (Optional)

For larger projects, you might want to separate your configuration from your application logic. You can use the config_from_object method to load settings from a dedicated module or a configuration class.

# Example of loading configuration from a module
app.config_from_object('celeryconfig')

# Or directly setting a configuration value
app.conf.task_serializer = 'json'

The app.conf attribute is a Settings object that lets you modify behavior such as serialization, task routing, or concurrency limits.

Step 4: Start the Celery Worker

To execute the tasks you've defined, you need to run a worker process. You can do this using the start() method within your script or via the command line.

Add this block to the end of your tasks.py:

if __name__ == '__main__':
    app.start()

Now, open your terminal and start the worker:

python tasks.py worker --loglevel=info

What this does

The app.start() method invokes the Celery command-line interface. By passing worker as an argument, you tell Celery to start a process that listens to the broker for new tasks and executes them using the add function you defined.

Complete Result

Your final tasks.py should look like this:

from celery import Celery

# 1. Initialize the app
app = Celery(
    'tasks',
    broker='pyamqp://guest@localhost//',
    backend='rpc://'
)

# 2. Define a task
@app.task
def add(x, y):
    return x + y

# 3. Entry point for the worker
if __name__ == '__main__':
    app.start()

To verify it works, open a separate Python shell while the worker is running:

from tasks import add

# Send the task to the worker
result = add.delay(4, 4)

# Get the result from the backend
print(f"Task result: {result.get(timeout=10)}")

Next Steps

  • Explore Task Routing to send different tasks to different queues.
  • Use app.autodiscover_tasks() in larger projects (like Django) to automatically find tasks in multiple modules.
  • Learn about Signatures and Canvas to build complex workflows by chaining tasks together.