
Celery Asynchronous Task Execution Flow

This walkthrough traces the lifecycle of a Celery task from its invocation on the client side, through its execution in a worker, to the storage of its result.

  1. Task Invocation: The client calls delay() or apply_async() on a task instance; both delegate to the Celery.send_task method.
  2. Message Preparation: The Celery app uses its AMQP component to create a task message and then publishes it to the message broker (e.g., RabbitMQ or Redis) using a kombu.Producer.
  3. Task Receipt: The worker's Consumer receives the message from the broker. It identifies the task's execution strategy and creates a Request object.
  4. Execution Scheduling: The consumer passes the request to the WorkController, which dispatches it to the execution pool (e.g., prefork, eventlet).
  5. Task Execution: Inside the pool worker, the celery.app.trace logic wraps the actual task function. It handles pre-run signals, executes the function, and manages post-run activities.
  6. Result Storage: Upon successful completion, the tracer calls the BaseBackend.mark_as_done method of the configured result backend to persist the task's return value.

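The six steps above can be condensed into a toy end-to-end model. This is a hedged sketch, not Celery's real implementation: the names (send_task, worker_consume_one, registry) are illustrative, a queue.Queue stands in for the broker, and a plain dict stands in for the result backend.

```python
import json
import queue

# Illustrative stand-ins for the real components.
broker = queue.Queue()   # plays the role of RabbitMQ/Redis
result_backend = {}      # plays the role of BaseBackend storage
registry = {}            # worker-side task registry

def task(fn):
    """Register a function under its name, loosely like @app.task."""
    registry[fn.__name__] = fn
    return fn

def send_task(name, args):
    """Client side (steps 1-2): serialize a task message and publish it."""
    message = json.dumps({"id": f"{name}-1", "task": name, "args": args})
    broker.put(message)
    return json.loads(message)["id"]

def worker_consume_one():
    """Worker side (steps 3-6): receive, execute, store the result."""
    message = json.loads(broker.get())
    fn = registry[message["task"]]          # strategy lookup by task name
    retval = fn(*message["args"])           # execution inside the pool
    result_backend[message["id"]] = retval  # mark_as_done equivalent
    return retval

@task
def add(x, y):
    return x + y

task_id = send_task("add", [2, 3])
worker_consume_one()
print(result_backend[task_id])  # -> 5
```

In real Celery the serialization and publishing go through kombu, the worker runs the task in a separate pool process, and the backend persists results externally; the data flow, however, mirrors this sketch.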
Key Architectural Findings:

  • Tasks are initiated via apply_async (or its shortcut delay), which delegates to Celery.send_task.
  • The AMQP component handles the low-level message creation and publishing via kombu.
  • The worker's Consumer uses a 'strategy' pattern to handle incoming messages based on task type.
  • Task execution is decoupled from message consumption using an execution pool (e.g., prefork).
  • The trace_task function in celery.app.trace is the core wrapper that executes the task and interacts with the result backend.