Django and Celery

You might be able to get away without using an asynchronous queueing system for your first few backend projects. But soon enough, you're going to need one!

During my 15+ years in backend development, I've had cause to use a few different ones, each with their own pros and cons. I've generally relied on Django-Q, but the original project maintainer appears to have gone AWOL. This gave rise to Django-Q2 which incorporated a number of patches and breathed a bit of life into the project. It's generally met my needs as a straightforward Django-friendly way to do queues.

The go-to option for many folk seems to have been Celery. I've always considered using it, but it seemed a bit too heavyweight for my needs. On a recent push for our customer ORDNA I decided that the time had come to try it out.

Basic setup for Django and Celery

The Celery documentation is very good, and the First Steps with Celery guide is the sensible place to get started. If you're looking to do this all with Django, I'd recommend the following recipe:

  • Use a Redis broker - it's easy enough to set one up locally with Docker: docker run -d -p 6379:6379 redis
  • Install celery - pip install celery
  • Set up Celery to integrate with your Django application

The last step is probably the fiddliest. Under your project/project directory (usually where your Django settings.py file is located), create a celery.py that looks something like the following (note that my project name is 'ordna' - yours will be different):

import os
from celery import Celery

# Make sure the Django settings module is set before the app is configured.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ordna.settings')

app = Celery('ordna')

# Pick up any settings prefixed with CELERY_ from your Django settings file.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Look for a tasks.py module in each of your installed Django apps.
app.autodiscover_tasks()

and then add the following to the __init__.py in the same directory (you may need to create it if it doesn't already exist):

from .celery import app as celery_app

__all__ = ('celery_app',)
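
At this point it's worth a quick sanity check that everything imports cleanly. From the directory containing manage.py (again, substituting your own project name for 'ordna'):

python -c "from ordna import celery_app; print(celery_app)"

If all is well, you should see something like <Celery ordna at 0x104069ee0>.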

It's probably a good idea to ensure you have a way to retrieve results in Django as well. To do this you will need some kind of "results backend" where the results of queued tasks can be stored. There's a nice version of this which makes use of the Django database, so why not start there:

pip install django-celery-results

then add django_celery_results to your application INSTALLED_APPS in your settings file. You'll also need to run:

python manage.py migrate

so that the necessary tables are created for results storage.
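
For the avoidance of doubt, the INSTALLED_APPS change is a one-liner (the other entries here are just stand-ins for whatever your project already lists):

INSTALLED_APPS = [
    ...
    'django_celery_results',
]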

In your settings file you can now add the following:

# 'django-db' is the alias for the django-celery-results backend
CELERY_RESULT_BACKEND = 'django-db'
# Redis on localhost, using the default port (6379)
CELERY_BROKER_URL = 'redis://localhost'

and you are ready to go!

Setting up an endpoint

The documentation covers how to write a very simple queueable "add" function. I more or less copied it into a file called "tasks.py" in the root directory of my application. My Django application is called "main", and so I created it under main/tasks.py.

from celery import shared_task
import logging

# Standard Python logging works inside tasks; output appears in the worker's log
logger = logging.getLogger(__name__)

@shared_task
def add(x, y):
    return x + y
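
A quick aside: the @shared_task decorator (as opposed to @app.task) doesn't tie the task to a concrete Celery app instance, which keeps your Django apps reusable and avoids importing the project's celery.py from application code.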

As I'm using Django REST Framework (DRF), I set myself up a very simple endpoint as follows:

from rest_framework.permissions import AllowAny
from rest_framework.renderers import JSONRenderer
from rest_framework.response import Response
from rest_framework.views import APIView

from main.tasks import add
from ordna.celery import app


class AsyncTests(APIView):
    permission_classes = [AllowAny]
    renderer_classes = [JSONRenderer]
    schema = None

    def get(self, request, format=None):
        # Look up the task using the ID we handed back from post() below
        result = app.AsyncResult(request.data["id"])
        output = None
        if result.state == "SUCCESS":
            output = result.get()

        return Response({"state": result.state, "output": output})

    def post(self, request, format=None):
        # Queue the task and return its ID immediately, without waiting
        result = add.delay(int(request.data["first"]), int(request.data["second"]))
        return Response({"id": str(result.id)})

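One gotcha worth knowing about before you rely on this: Celery reports a state of PENDING both for tasks that are genuinely waiting in the queue and for task IDs it has never heard of, so don't treat PENDING as proof that a task exists.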

Again, note that my project is named 'ordna' and yours will be different. Don't forget to add the API to your urls.py:

urlpatterns = [
    ...
    path("api/asynctests/", views.AsyncTests.as_view()),
    ...
]
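
This assumes your urls.py already imports path from django.urls, and that views here refers to the "main" application's views module (from main import views).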

Running the backend and the worker

Once you've done the above, you'll need two terminals. In one, you can run your usual local Django server:

python manage.py runserver

In the other terminal, you'll need to run a Celery worker. This is the entity that is responsible for ensuring tasks that you have queued are executed and the results are saved in the results backend.

celery -A ordna worker --loglevel=INFO
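
With both of those running, you can sanity-check the whole pipeline from a Django shell before involving HTTP at all (this assumes Redis and the worker above are still up):

$ python manage.py shell
>>> from main.tasks import add
>>> result = add.delay(2, 3)
>>> result.get(timeout=10)
5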

Trying it out

You're now ready to try adding two numbers together! I use HTTPie, and here's how I tested it (from yet another terminal window):

[nclarey@destructor:2004 ~]$ http post http://localhost:8000/api/asynctests/ first=1 second=2
HTTP/1.1 200 OK
Allow: GET, POST, HEAD, OPTIONS
Content-Length: 45
Content-Type: application/json
Cross-Origin-Opener-Policy: same-origin
Referrer-Policy: same-origin
Server: daphne
Vary: Origin
X-Content-Type-Options: nosniff
X-Frame-Options: DENY

{
    "id": "303ff04b-6cd3-4092-8690-21dcf273484c"
}

Super! We've submitted a request via Django to Celery. If you keep half an eye on the Celery worker process window you'll see something like this:

[2025-02-20 04:44:10,248: INFO/MainProcess] mingle: searching for neighbors
[2025-02-20 04:44:11,264: INFO/MainProcess] mingle: all alone
[2025-02-20 04:44:11,315: INFO/MainProcess] celery@destructor ready.
[2025-02-20 04:44:41,835: INFO/MainProcess] Task main.tasks.add[303ff04b-6cd3-4092-8690-21dcf273484c] received
[2025-02-20 04:44:41,844: INFO/ForkPoolWorker-7] Task main.tasks.add[303ff04b-6cd3-4092-8690-21dcf273484c] succeeded in 0.008057582890614867s: 3

Looks good! Let's retrieve the result using a GET to the endpoint, specifying the ID of the task:

[nclarey@destructor:2005 ~]$ http get http://localhost:8000/api/asynctests/ id=303ff04b-6cd3-4092-8690-21dcf273484c
HTTP/1.1 200 OK
Allow: GET, POST, HEAD, OPTIONS
Content-Length: 30
Content-Type: application/json
Cross-Origin-Opener-Policy: same-origin
Referrer-Policy: same-origin
Server: daphne
Vary: Origin
X-Content-Type-Options: nosniff
X-Frame-Options: DENY

{
    "output": 3,
    "state": "SUCCESS"
}

We've done it, and have a basic task queue up and running with Django.

Wrapping up

As simple as that, we've configured Celery to run with Django, added a simple task, integrated it with an endpoint, and tried it out. We'll give some more sophisticated examples in the coming weeks.

Nick Clarey

Airsource design and develop apps for ourselves and for our clients. We help people like you to turn concepts into reality, taking ideas from initial design through development and on to ongoing maintenance and support.

Contact us today to find out how we can help you build and maintain your app.