Docker: Use Celery in Django (Redis as Broker)

In the previous two posts, we deployed Django with Postgres and Nginx; now it's time to do some async work using Celery. In this post, I will do the magic tricks first and explain them later.

Add Celery to Django

To add Celery, we need to make a container for it. We can reuse Django's Dockerfile for Celery's container like this:

FROM python:latest  
ENV PYTHONUNBUFFERED 1

#ENV C_FORCE_ROOT true # intentionally kept it commented

ENV APP_USER user  
ENV APP_ROOT /src

RUN groupadd -r ${APP_USER} \  
    && useradd -r -m \
    --home-dir ${APP_ROOT} \
    -s /usr/sbin/nologin \
    -g ${APP_USER} ${APP_USER}

WORKDIR ${APP_ROOT}

RUN mkdir /config  
ADD config/requirements.txt /config/  
RUN pip install -r /config/requirements.txt

USER ${APP_USER}  
ADD . ${APP_ROOT}

Now let's update the docker-compose.yml file to link Django with Celery, and also link the Redis container to Celery.

version: '2'  
services:  
  nginx:
    image: nginx:latest
    container_name: nx01
    ports:
      - "8001:8001"
    volumes:
      - ../src:/src
      - ./static:/static
      - ./media:/media/
      - ./config/nginx:/etc/nginx/conf.d
    depends_on:
      - web
  web:
    build: .
    container_name: dg01
    command: gunicorn mydjango.wsgi:application --bind 0.0.0.0:8000
    depends_on:
      - db
    links:
      - redis
    volumes:
      - ../src:/src
      - ./static:/static
      - ./media:/media/
    expose:
      - "8000"
  db:
    image: postgres:latest
    container_name: pq01
    ports:
     - "5432:5432"

  redis:
    image: redis:latest
    container_name: rd01
    ports:
     - '6379:6379'

  celery:
    build: .
    container_name: cl01
    command: celery worker --app=mydjango
    volumes:
      - ../src:/src
    links:
      - db
      - redis

Now in django project, lets add broker url for celery in settings.py:

CELERY_BROKER_URL = 'redis://redis:6379/0'  
CELERY_RESULT_BACKEND = 'redis://redis:6379/0'  

The Celery integration with Django is complete. You can skip the next section if you already have Celery tasks.

Make a Simple Async Task

How do we use Celery from Django? There is excellent documentation about that on Celery's own documentation page.

To keep this post short, I will not go into details, just the basic steps for making a simple async task with Celery.

Let's add a celery.py inside the mydjango>mydjango directory.

from __future__ import absolute_import, unicode_literals  
import os  
from celery import Celery  
import logging  
logger = logging.getLogger("Celery")

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mydjango.settings')

app = Celery('mydjango')

app.config_from_object('django.conf:settings', namespace='CELERY')

app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):  
    print('Request: {0!r}'.format(self.request))
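A note on the namespace='CELERY' argument above: it tells Celery to read its configuration from Django's settings, using only the names that start with the CELERY_ prefix, which is why the broker settings earlier are called CELERY_BROKER_URL and CELERY_RESULT_BACKEND. Roughly, the lookup works like this toy sketch (collect_namespaced_settings is illustrative, not Celery's actual code):

```python
def collect_namespaced_settings(settings, namespace='CELERY'):
    # Keep only settings that start with "<namespace>_", strip the prefix,
    # and lowercase the rest to form the Celery option name.
    prefix = namespace + '_'
    return {
        name[len(prefix):].lower(): value
        for name, value in settings.items()
        if name.startswith(prefix)
    }

django_settings = {
    'DEBUG': True,
    'CELERY_BROKER_URL': 'redis://redis:6379/0',
    'CELERY_RESULT_BACKEND': 'redis://redis:6379/0',
}
print(collect_namespaced_settings(django_settings))
# {'broker_url': 'redis://redis:6379/0', 'result_backend': 'redis://redis:6379/0'}
```

This is also why unprefixed settings like DEBUG are ignored by Celery.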

Now create a new app named myapp and add it to the Django settings:

INSTALLED_APPS = [  
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'myapp.apps.MyappConfig',
]

Now we need to install celery and redis:

pip install celery  
pip install redis  

or we can add them to config>requirements.txt.
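For reference, the requirements file would then contain entries along these lines (package names only; pin the versions your project needs):

```
Django
gunicorn
psycopg2
celery
redis
```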

Now let's add a simple email-sending task in src>mydjango>myapp>tasks.py:

from __future__ import absolute_import, unicode_literals

from django.conf import settings  
from django.core.mail import EmailMultiAlternatives  
from mydjango.celery import app


@app.task
def send_email(recipient_list, subject, body, from_address=None):  
    # Accept either a single address string or a list of addresses.
    if not isinstance(recipient_list, list):
        recipient_list = [recipient_list]
    if not from_address:
        from_address = settings.EMAIL_FROM_ADDRESS

    msg = EmailMultiAlternatives(subject, body, from_address, recipient_list)
    msg.send()
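The isinstance check at the top lets callers pass either a single address string or a list. In isolation, the normalization behaves like this standalone sketch (normalize_recipients is illustrative, not part of the project):

```python
def normalize_recipients(recipient_list):
    # Wrap a bare address string into a one-element list, as the task does.
    if not isinstance(recipient_list, list):
        recipient_list = [recipient_list]
    return recipient_list

print(normalize_recipients('a@example.com'))
# ['a@example.com']
print(normalize_recipients(['a@example.com', 'b@example.com']))
# ['a@example.com', 'b@example.com']
```

From a view or the Django shell, you would then queue the task with send_email.delay(['you@example.com'], 'Subject', 'Body', None) rather than calling it directly.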

Also add your SMTP configuration to the Django settings.
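For example, with an external SMTP provider the settings block might look like this (all values are placeholders; EMAIL_FROM_ADDRESS is the custom setting the task above falls back to):

```python
EMAIL_HOST = 'smtp.example.com'          # your provider's SMTP server
EMAIL_PORT = 587
EMAIL_HOST_USER = 'user@example.com'
EMAIL_HOST_PASSWORD = 'secret'
EMAIL_USE_TLS = True
EMAIL_FROM_ADDRESS = 'user@example.com'  # default sender used by send_email
```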

That's it; that should do the trick. Now just run docker-compose build and docker-compose up to get the project running.

Explanations

In the first step, we updated the Dockerfile responsible for building the Django application's environment.

ENV APP_USER user  
ENV APP_ROOT /src

RUN groupadd -r ${APP_USER} \  
    && useradd -r -m \
    --home-dir ${APP_ROOT} \
    -s /usr/sbin/nologin \
    -g ${APP_USER} ${APP_USER}

The lines above create a system group and an unprivileged user named by APP_USER, with APP_ROOT as its home directory and no login shell. This matters for Celery: workers should not run as root (hence the commented-out C_FORCE_ROOT line in the Dockerfile), so the image switches to this user instead.

The rest of the file creates a directory named src and makes it the working directory, makes a config directory, puts the requirements.txt file in it, and installs the packages using pip. And of course, we are pulling from the latest Python image.

In docker-compose.yml, we have added a new service named celery (container cl01), built from the Dockerfile that lives in the same directory as docker-compose.yml. The redis container is linked to celery and is reachable on port 6379.

The new celery container is linked to the db and redis containers, so it shares the database and the broker with the django container.

The celery service runs the command celery worker --app=mydjango, which loads the Celery app defined in mydjango/celery.py and, thanks to autodiscover_tasks(), executes tasks from installed apps such as myapp.
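Under the hood, app.autodiscover_tasks() does roughly the following: for each installed Django app, it tries to import a tasks submodule and registers the @app.task functions it finds there. Here is a stdlib-only sketch of the import half (autodiscover_task_modules is illustrative, not Celery's actual code; the email/json example below just demonstrates the mechanism with stdlib packages):

```python
import importlib

def autodiscover_task_modules(installed_apps, module_name='tasks'):
    # Try to import "<app>.<module_name>" for every app; keep the ones that exist.
    found = []
    for app_name in installed_apps:
        try:
            importlib.import_module('{0}.{1}'.format(app_name, module_name))
            found.append('{0}.{1}'.format(app_name, module_name))
        except ImportError:
            continue
    return found

# email.mime exists as a submodule, json.mime does not:
print(autodiscover_task_modules(['email', 'json'], module_name='mime'))
# ['email.mime']
```

In the real project, this is why myapp's tasks live in myapp/tasks.py: that is the module name autodiscovery looks for by default.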

Need proof that this works?

Go to this github link, then pull and build. Don't forget to update the email configuration inside the Django settings.

Checkout previous posts about docker:
1. Deploy Django, Gunicorn, NGINX, Postgresql using Docker
2. Serve Static Files by Nginx from Django using Docker
