
Docker: Use Celery in Django(Redis as Broker)

In the previous two posts, we deployed Django with Postgres and Nginx; now it's time to do some async stuff using Celery. In this post, I will do the magic tricks first and explain them later.

Add Celery to Django

To add Celery, we need to make a container for it. We can re-use Django's Dockerfile to build Celery's container, like this:

FROM python:3.6
RUN mkdir /src
RUN mkdir /static
ADD ./src /src
WORKDIR /src
RUN pip install -r requirements.pip
CMD python manage.py collectstatic --no-input; python manage.py migrate; gunicorn mydjango.wsgi -b 0.0.0.0:8000 & celery worker --app=myapp.tasks

**FYI:** We are avoiding the Python 3.7 and **latest** Docker images, because Celery conflicts with Python's async API (reference).

Now let's update the docker-compose.yml file to link Django with Celery, and also link the Redis container to Celery.

version: "2"
services:
  nginx:
    image: nginx:latest
    container_name: nz01
    ports:
      - "8000:8000"
    volumes:
      - ./src:/src
      - ./config/nginx:/etc/nginx/conf.d
    depends_on:
      - web
  web:
    build: .
    container_name: dz01
    depends_on:
      - db
    volumes:
      - ./src:/src
    expose:
      - "8000"
    links:
      - redis
  db:
    image: postgres:latest
    container_name: pz01
  redis:
    image: redis:latest
    container_name: rz01
    ports:
      - "6379:6379"

Now in the Django project, let's add the broker URL for Celery in the settings file:

CELERY_BROKER_URL = 'redis://redis:6379/0'
CELERY_RESULT_BACKEND = 'redis://redis:6379/0'
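Note that the hostname in these URLs is `redis` — the compose service name, resolved by Docker's internal DNS — not `localhost`. A quick sketch of the URL's parts:

```python
from urllib.parse import urlparse

# Broker URL format: redis://<host>:<port>/<db-number>
url = urlparse('redis://redis:6379/0')
print(url.hostname)  # 'redis' -> the docker-compose service name
print(url.port)      # 6379
print(url.path)      # '/0' -> redis database number 0
```

If you try `localhost` here instead, the Django container will look for Redis inside itself and fail to connect.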

Integration of Celery with Django is complete. You don't need to read the next section if you already have Celery tasks.

Make a simple async task

How do you use Celery from Django? There is amazing documentation about that on Celery's own documentation page.

To keep this post short, I will not go into details, but just do the basic stuff for making a simple async task using Celery.

Let's add a celery.py inside the mydjango>mydjango directory:

from __future__ import absolute_import, unicode_literals
import os
import logging

from celery import Celery

logger = logging.getLogger("Celery")

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mydjango.settings')
app = Celery('mydjango')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
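The `namespace='CELERY'` argument tells Celery to read only the Django settings that start with `CELERY_` and map them to its own config keys. A rough sketch of that mapping (an illustration, not Celery's actual code):

```python
# Illustration of what namespace='CELERY' does conceptually:
# settings prefixed with CELERY_ become lowercase Celery config keys.
django_settings = {
    'DEBUG': True,
    'CELERY_BROKER_URL': 'redis://redis:6379/0',
    'CELERY_RESULT_BACKEND': 'redis://redis:6379/0',
}

celery_config = {
    key[len('CELERY_'):].lower(): value
    for key, value in django_settings.items()
    if key.startswith('CELERY_')
}
print(celery_config)
# {'broker_url': 'redis://redis:6379/0', 'result_backend': 'redis://redis:6379/0'}
```

This is why the settings we added earlier must carry the `CELERY_` prefix: anything without it (like `DEBUG`) is ignored by Celery.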

And create a new app named myapp, then add it to INSTALLED_APPS in the Django settings:

INSTALLED_APPS = [
    # ...
    'myapp',
    'celery', # Don't forget to add celery
]

Now we need to install celery and redis by using:

pip install celery
pip install redis

or we can add them to config>requirements.pip.

Now let's add a simple task in src>mydjango>myapp>tasks.py:

from __future__ import absolute_import, unicode_literals
import logging

from django.conf import settings
from mydjango.celery import app

logger = logging.getLogger("celery")

@app.task
def show_hello_world():
    logger.info("-"*25)
    logger.info("Printing Hello from Celery")
    logger.info("-"*25)

That's it, that should do the trick. Now just run docker-compose build and docker-compose up to get the project running. If you do, you should see an output like this:



In the first step, we updated the Dockerfile, which was responsible for building the Django application's environment.

FROM python:3.6
RUN mkdir /src
RUN mkdir /static
ADD ./src /src
WORKDIR /src
RUN pip install -r requirements.pip
CMD python manage.py collectstatic --no-input; python manage.py migrate; gunicorn mydjango.wsgi -b 0.0.0.0:8000 & celery worker --app=myapp.tasks

Here we are using the python 3.6 image and creating the src & static directories inside it. After that, we add the local src directory to the container's /src and make it the working directory (whenever you go into the container, you will land directly inside the src folder). Then we install the requirements. Finally, the CMD command runs collectstatic, migrations, and gunicorn, and in the end starts the celery worker.
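The separators in that CMD matter: `;` runs collectstatic and migrate one after another, while the single `&` backgrounds gunicorn so the shell can go on to start the celery worker in the same container. A minimal sketch of that shell behavior, with placeholder echo commands standing in for the real ones:

```shell
# ';' runs commands sequentially; '&' puts the preceding command in the
# background so the next one starts without waiting for it to finish.
sh -c 'echo collectstatic; echo migrate; echo gunicorn & echo celery'
```

If the `&` were a `;`, the celery worker would never start, because gunicorn runs in the foreground forever.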

In docker-compose.yml, we are adding nothing new since the last step.

Now the new Celery worker will be running in the old Django container.

Celery will run this command: celery worker --app=myapp.tasks, which will execute tasks within an app named myapp.
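Conceptually, `@app.task` registers the function with the Celery app so a worker can look it up by name when a message arrives from the broker. A toy illustration of that idea (not Celery's real internals):

```python
# Toy sketch of task registration -- illustration only,
# not Celery's actual implementation.
registry = {}

def task(fn):
    """Register a function under its name, like @app.task does."""
    registry[fn.__name__] = fn
    return fn

@task
def show_hello_world():
    return "Printing Hello from Celery"

# A worker receives the task name from the broker and looks it up:
message = {'task': 'show_hello_world'}
print(registry[message['task']]())  # Printing Hello from Celery
```

This is why the worker needs `--app=myapp.tasks`: it must import the module so the decorated functions get registered before messages arrive.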

Need proof that this works?

Go to this github link, then pull and build. Don't forget to update the email configurations inside the Django settings.

Checkout previous posts about docker:

  1. Deploy Django, Gunicorn, NGINX, Postgresql using Docker
  2. Serve Static Files by Nginx from Django using Docker

Docker Django Celery

Share Your Thoughts

Thursday, Jun 15, 2017

I’m getting an error : ModuleNotFoundError: No module named ‘utility’, with celery, why is that ?

Wednesday, Apr 25, 2018

It should be fixed now. Please pull the latest code from repository

Thursday, Mar 26, 2020

you should not use heroku for blog comments, it starts too late and i cannot see my previous 2 attempts at commenting..

i asked how to connect with psql after typing make shell-db

great article anyway!

Friday, Mar 27, 2020

You can connect like this:

First, connect to shell:

docker exec -ti <container id> /bin/sh

Or use make shell-db

Then, run psql to access postgres. Or if you have set up POSTGRES_USER, POSTGRES_DB and POSTGRES_PASSWORD, then use psql -d <value of POSTGRES_DB> -U <value of POSTGRES_USER> -W. It will prompt you to input your password.

On another note, I am using Heroku for its free dynos. If you can suggest a free alternative, I would appreciate that :)