Django/Python
Run background tasks with Celery
February 14th, 2022
In this article:
- Quick howto
- Periodic tasks with Celery Beat
In today's update, we've added support for Celery. This means setting up background tasks for your Django project is now as easy as clicking an option.
For those who haven't used Celery before, it's a background task runner for Python. It's been around forever, is stable, and is widely used. Celery allows you to queue up tasks to be run by independent worker processes, outside the Python process that queued them. This works even if the workers are on different servers.
Quick howto
To use Celery in your API Bakery project, enable it when building your code. API Bakery will then install Celery, create a Celery app instance, prepare a tasks.py file for you, and add relevant documentation to README.md, so there's nothing you need to set up manually.
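For reference, the generated app instance usually follows the official first-steps-with-Django pattern. Here's a minimal sketch (assuming the project module is named project, matching the -A project flags used below, and that the settings module is project.settings; the actual generated code may differ):
# project/celery.py - a minimal Celery app instance for a Django project
import os

from celery import Celery

# Make sure Django settings are loaded before the Celery app is configured.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")

app = Celery("project")

# Read all settings prefixed with CELERY_ from the Django settings module.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Discover tasks.py modules in all installed Django apps.
app.autodiscover_tasks()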
Next, configure the preferred queue mechanism (or "broker", as Celery calls it) and the results backend (if you want your tasks to be able to return results to the caller) in your .env file. The default broker is AMQP, but Redis is also well-supported.
Finally, define tasks as Python functions in a tasks.py file in your Django apps.
Here's an excerpt from .env configuring Celery to use Redis as the broker and disabling the result backend (for optimum performance when we don't use results from tasks):
# Celery broker (default is AMQP)
# See https://docs.celeryproject.org/en/stable/getting-started/backends-and-brokers/index.html#broker-instructions
CELERY_BROKER_URL=redis://localhost/
# Backend to store results in (enable it only if you need it, for performance reasons)
# See https://docs.celeryproject.org/en/stable/getting-started/backends-and-brokers/index.html#summaries
CELERY_BACKEND=
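These environment variables feed the CELERY_-prefixed Django settings that the Celery app reads. Conceptually, the wiring looks something like this (a sketch using plain os.environ; the settings code API Bakery actually generates may differ):
# project/settings/base.py (excerpt) - mapping .env values to Django settings
import os

CELERY_BROKER_URL = os.environ.get("CELERY_BROKER_URL", "amqp://localhost/")

# An empty CELERY_BACKEND value disables the result backend entirely.
CELERY_RESULT_BACKEND = os.environ.get("CELERY_BACKEND") or None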
Then, we define tasks in core/tasks.py (you can put a tasks.py in every relevant Django app):
from logging import getLogger

from celery import shared_task

log = getLogger(__name__)


# Define your background tasks here
# See https://docs.celeryproject.org/en/stable/django/first-steps-with-django.html#using-the-shared-task-decorator
@shared_task
def hello_world():
    log.info("Hello, World!")
Now, we need to run the Celery worker. Within your Python virtual environment, run:
celery -A project worker -l INFO
Tasks are enqueued by calling the delay() method on them. In another window, start the Django management shell and enqueue the hello_world task:
$ python manage.py shell
>>> from core.tasks import hello_world
>>> hello_world.delay()
Now check the output from the Celery worker.
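The delay() method also forwards any positional and keyword arguments to the task function. For example, a hypothetical add task (not part of the generated code) could be enqueued like this; note that reading the return value back requires a result backend, which we left disabled above:
from celery import shared_task


@shared_task
def add(x, y):
    return x + y


# Enqueue the task; this returns an AsyncResult immediately,
# without waiting for a worker to pick the task up.
result = add.delay(2, 3)

# result.get() would return 5, but only with a result backend configured;
# with the backend disabled, the return value is discarded.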
Periodic tasks with Celery Beat
While vanilla Celery only lets us run tasks as soon as possible, Celery Beat lets us run a task periodically, for example every X seconds or on a specific day of the week.
To use this feature, write your tasks in tasks.py as usual and configure them in your project/settings/base.py. For example, this will run our hello_world task every 5 seconds:
# Periodic tasks via Celery Beat
CELERY_BEAT_SCHEDULE = {
    'run hello task every 5 seconds': {
        'task': 'core.tasks.hello_world',
        'schedule': 5.0,
    }
}
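The schedule doesn't have to be a number of seconds. For day-of-week style schedules, Celery ships a crontab helper; for example (a sketch, not part of the generated settings):
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'run hello task every monday morning': {
        'task': 'core.tasks.hello_world',
        # Every Monday at 7:30 AM (server time)
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
    }
}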
We also need to start another Celery process, Celery Beat. It is responsible for checking which tasks should be run and queuing them at the appropriate time. Run:
celery -A project beat -l INFO
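During development, you can instead run a single process that embeds the beat scheduler into the worker by passing the -B flag (the Celery docs advise against this in production):
celery -A project worker -B -l INFO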
Now, every 5 seconds, the worker process should display the Hello World message.