Ruddra.com

Docker: Use Celery in Django(Redis as Broker)

In the previous two posts, we deployed Django with Postgres and Nginx. Now it's time to do some async work using Celery. In this post, I will do the magic tricks first and explain them later.

Add Celery to Django

To add Celery, we need to make a container for it. We can reuse Django's Dockerfile to build Celery's container, like this:

FROM python:3.6
ENV PYTHONUNBUFFERED 1
ENV C_FORCE_ROOT true
RUN mkdir /src
RUN mkdir /static
WORKDIR /src
ADD ./src /src
RUN pip install -r requirements.pip
CMD python manage.py collectstatic --no-input;python manage.py migrate; gunicorn mydjango.wsgi -b 0.0.0.0:8000 & celery worker --app=myapp.tasks

FYI: We are avoiding the Python 3.7 (or latest) Docker image, because older Celery releases conflict with Python 3.7, where async became a reserved keyword (reference).
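The clash is easy to demonstrate: a quick sketch showing that async stopped being a valid identifier in Python 3.7, which is what tripped up older Celery releases.

```python
import keyword

# 'async' and 'await' became reserved keywords in Python 3.7,
# breaking Celery releases that still used 'async' as a name
print(keyword.iskeyword('async'))  # True on Python 3.7+
```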

Now let's update the docker-compose.yml file to link Django with Celery, and also link the Redis container to Celery.

version: "2"
services:
  nginx:
    image: nginx:latest
    container_name: nz01
    ports:
      - "8000:8000"
    volumes:
      - ./src:/src
      - ./config/nginx:/etc/nginx/conf.d
    depends_on:
      - web
  web:
    build: .
    container_name: dz01
    depends_on:
      - db
    volumes:
      - ./src:/src
    expose:
      - "8000"
    links:
      - redis
  db:
    image: postgres:latest
    container_name: pz01
  redis:
    image: redis:latest
    container_name: rz01
    ports:
      - "6379:6379"

Now, in the Django project, let's add the broker URL for Celery in settings.py. Note that the redis hostname in the URL is the name of the Redis service in docker-compose.yml, which Docker's internal networking resolves to the container:

CELERY_BROKER_URL = 'redis://redis:6379/0'
CELERY_RESULT_BACKEND = 'redis://redis:6379/0'
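As a quick sanity check (a sketch, assuming the redis Python package is installed and that it runs from inside the web container, where the redis hostname resolves), you can ping the broker directly:

```python
import redis

# same host/port/db as CELERY_BROKER_URL
r = redis.Redis(host='redis', port=6379, db=0)
print(r.ping())  # True when the broker container is up
```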

Integration of Celery with Django is complete. You don't need to read the next section if you already have Celery tasks.

Make a simple async task

How do you use Celery from Django? Celery's own documentation page covers this very well.

To keep this post short, I will not go into details; I will just do the basic work needed to create a simple async task using Celery.

Let's add a celery.py inside the mydjango>mydjango directory:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
import logging

logger = logging.getLogger("Celery")
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mydjango.settings')
app = Celery('mydjango')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
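One step from Celery's first-steps-with-Django guide is worth adding here as well: import the app in mydjango>mydjango>__init__.py so that it is loaded whenever Django starts and tasks can find it.

```python
# mydjango/__init__.py
from __future__ import absolute_import, unicode_literals

# load the Celery app whenever Django starts so tasks can find it
from .celery import app as celery_app

__all__ = ('celery_app',)
```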

And create a new app named myapp, then add it to the Django settings:

INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'myapp.apps.MyappConfig',
    'celery', # Don't forget to add celery
]

Now we need to install celery and redis by using:

pip install celery
pip install redis

or we can add them to config>requirements.pip.
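For example, the requirements file could pin versions known to work with Python 3.6 (the exact versions here are illustrative; check the repository for the ones actually used):

```text
celery==4.1.0
redis==2.10.6
```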

Now let's add a simple task in src>mydjango>myapp>tasks.py that logs a hello message:

from __future__ import absolute_import, unicode_literals
import logging

from django.conf import settings
from mydjango.celery import app

logger = logging.getLogger("celery")


@app.task
def show_hello_world():
    logger.info("-"*25)
    logger.info("Printing Hello from Celery")
    logger.info("-"*25)
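With the worker running, the task can be enqueued from a Django view or a shell inside the web container. A minimal sketch:

```python
from myapp.tasks import show_hello_world

# .delay() pushes the task onto the Redis broker; the worker picks it
# up and the log lines appear in the worker's output, not here
show_hello_world.delay()
```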

That's it; that should do the trick. Now just run docker-compose build and then docker-compose up to get the project running. If you do, you should see output like this:

(demo screenshot)

Explanations

In the first step, we updated the Dockerfile, which is responsible for building the Django application's environment.

FROM python:3.6
ENV PYTHONUNBUFFERED 1
ENV C_FORCE_ROOT true
RUN mkdir /src
RUN mkdir /static
WORKDIR /src
ADD ./src /src
RUN pip install -r requirements.pip
CMD python manage.py collectstatic --no-input;python manage.py migrate; gunicorn mydjango.wsgi -b 0.0.0.0:8000 & celery worker --app=myapp.tasks

Here we are using the Python 3.6 image, and inside it we create the src and static directories. After that, we add the local src directory to the container's /src and make it the working directory (whenever you enter the container, you land directly in the src folder). Then we install the requirements. Finally, the CMD command runs collectstatic, migrations, and Gunicorn, and at the end starts the Celery worker.
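The CMD chain above can be read step by step: the semicolons run the commands sequentially, while the ampersand backgrounds Gunicorn so the Celery worker can occupy the foreground.

```shell
python manage.py collectstatic --no-input  # gather static files
python manage.py migrate                   # apply database migrations
gunicorn mydjango.wsgi -b 0.0.0.0:8000 &   # serve Django in the background
celery worker --app=myapp.tasks            # worker stays in the foreground
```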

In docker-compose.yml, there is little new compared to the last step: only the redis service, and the links entry connecting it to the web container.

Now Celery will run inside the same container as Django.

It is started with the command celery worker --app=myapp.tasks, which executes the tasks defined in the app named myapp.

Need proof that this works?

Go to this GitHub link, then pull and build. Don't forget to update the email configuration inside the Django settings.

Checkout previous posts about docker:

  1. Deploy Django, Gunicorn, NGINX, Postgresql using Docker
  2. Serve Static Files by Nginx from Django using Docker

Last updated: May 27, 2020

Share Your Thoughts

R3SE
Thursday, June 15, 2017

I'm getting an error: ModuleNotFoundError: No module named 'utility' with Celery. Why is that?

Ruddra
Wednesday, April 25, 2018

It should be fixed now. Please pull the latest code from the repository.

sebadima
Thursday, March 26, 2020

You should not use Heroku for blog comments; it starts too late and I cannot see my previous 2 attempts at commenting.

I asked how to connect with psql after typing make shell-db.

Great article anyway!

Ruddra
Friday, March 27, 2020

You can connect like this:

First, connect to shell:

docker exec -ti <container id> /bin/sh

Or use make shell-db

Then, run psql to access postgres. Or if you have set up POSTGRES_USER, POSTGRES_DB and POSTGRES_PASSWORD, then use psql -d <value of POSTGRES_DB> -U <value of POSTGRES_USER> -W. It will prompt you to input your password.

On the other note, I am using Heroku for its free dynos. If you can suggest something as free alternative, I would appreciate that 😄