Docker: Use Celery in Django (Redis as Broker)
Nov 14, 2016 · 3 Min Read

In the previous two posts, we deployed Django with Postgres and Nginx; now it's time to do some async stuff using Celery. In this post, I will do the magic tricks first and explain them later.
Add ‘Celery’ to Django
To add Celery, we need to make a container for it. We can reuse Django's Dockerfile to build Celery's container, like this:
```dockerfile
FROM python:3.6
ENV PYTHONUNBUFFERED 1
ENV C_FORCE_ROOT true
RUN mkdir /src
RUN mkdir /static
WORKDIR /src
ADD ./src /src
RUN pip install -r requirements.pip
CMD python manage.py collectstatic --no-input; python manage.py migrate; gunicorn mydjango.wsgi -b 0.0.0.0:8000 & celery worker --app=myapp.tasks
```
**FYI:** We are avoiding the Python 3.7 (or `latest`) Docker image, because Celery conflicts with the `async` keyword introduced in Python 3.7 (reference on GitHub).
Now let's update the docker-compose.yml file to link Django with Celery, and also link the Redis container to Celery.
version: "2"
services:
nginx:
image: nginx:latest
container_name: nz01
ports:
- "8000:8000"
volumes:
- ./src:/src
- ./config/nginx:/etc/nginx/conf.d
depends_on:
- web
web:
build: .
container_name: dz01
depends_on:
- db
volumes:
- ./src:/src
expose:
- "8000"
links:
- redis
db:
image: postgres:latest
container_name: pz01
redis:
image: redis:latest
container_name: rz01
ports:
- "6379:6379"
Now, in the Django project, let's add the broker URL for Celery in settings.py:
```python
CELERY_BROKER_URL = 'redis://redis:6379/0'
CELERY_RESULT_BACKEND = 'redis://redis:6379/0'
```
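Because celery.py (added below) reads Django settings with the `CELERY` namespace, any setting prefixed with `CELERY_` is picked up by the worker. A few optional settings you may want, shown here as a sketch (these are assumptions for illustration, not required by this tutorial):

```python
# Optional extras (not required for this post):
CELERY_ACCEPT_CONTENT = ['json']    # only accept JSON-encoded task messages
CELERY_TASK_SERIALIZER = 'json'     # serialize task payloads as JSON
CELERY_RESULT_SERIALIZER = 'json'   # serialize results as JSON
CELERY_TIMEZONE = 'UTC'             # timezone used for scheduled tasks
```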
Integration of Celery with Django is complete. You don't need to read the next section if you already have Celery tasks.
Make a simple async task
How do you use Celery from Django? There is an amazing article about that in Celery's own documentation. To keep this post short, I will not go into details, but just do the basics to create a simple async task with Celery.
Let's add a celery.py inside the mydjango > mydjango directory.
```python
from __future__ import absolute_import, unicode_literals
import os
import logging

from celery import Celery

logger = logging.getLogger("Celery")

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mydjango.settings')

app = Celery('mydjango')

# Read all CELERY_* settings from Django's settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Auto-discover tasks.py modules in all installed Django apps.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
```
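The official Celery documentation also recommends importing this app in mydjango/__init__.py, so that it is always loaded when Django starts and shared tasks can find it:

```python
# mydjango/__init__.py
from __future__ import absolute_import, unicode_literals

# This makes sure the Celery app is imported whenever Django starts.
from .celery import app as celery_app

__all__ = ['celery_app']
```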
Then create a new app named myapp and add it to the Django settings:
```python
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'myapp.apps.MyappConfig',
    'celery',  # Don't forget to add celery
]
```
Now we need to install celery and redis by running:

```bash
pip install celery
pip install redis
```

or we can add them to config > requirements.pip so they are installed during the Docker build.
Now let's add a simple task in src > mydjango > myapp > tasks.py:
```python
from __future__ import absolute_import, unicode_literals
import logging

from django.conf import settings
from mydjango.celery import app

logger = logging.getLogger("celery")


@app.task
def show_hello_world():
    logger.info("-" * 25)
    logger.info("Printing Hello from Celery")
    logger.info("-" * 25)
```
That's it; that should do the trick. Now just run docker-compose build and docker-compose up to get the project running. Once it is up, you should see the Celery worker starting alongside Gunicorn in the container logs.
Explanations
In the first step, we updated the Dockerfile, which is responsible for building the Django application's environment.
```dockerfile
FROM python:3.6
ENV PYTHONUNBUFFERED 1
ENV C_FORCE_ROOT true
RUN mkdir /src
RUN mkdir /static
WORKDIR /src
ADD ./src /src
RUN pip install -r requirements.pip
CMD python manage.py collectstatic --no-input; python manage.py migrate; gunicorn mydjango.wsgi -b 0.0.0.0:8000 & celery worker --app=myapp.tasks
```
Here we are using the python:3.6 image and creating the src and static directories inside it. After that, we add the local src directory to the container's /src and make it the working directory (whenever you enter the container, you land directly in the /src folder). Then we install the requirements. Finally, the CMD collects static files, runs migrations, starts Gunicorn, and at the end starts the Celery worker.
In docker-compose.yml, little is new compared to the last post: we added the redis service and linked it to web. There is no separate container for Celery; the worker runs inside the existing Django container.
Celery runs via this command: celery worker --app=myapp.tasks, which executes the tasks defined in the myapp app.
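If you want to verify programmatically that a worker is alive, Celery's control API can ping it. A minimal sketch, run from `python manage.py shell` inside the web container (an illustration, not part of the original post):

```python
from mydjango.celery import app

# Returns one entry per live worker,
# e.g. [{'celery@<hostname>': {'ok': 'pong'}}]
print(app.control.ping(timeout=2.0))
```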
Need proof that this works?
Go to this GitHub link, then pull and build. Don't forget to update the email configuration inside the Django settings.
Check out previous posts about Docker:
- Deploy Django, Gunicorn, NGINX, Postgresql using Docker
- Serve Static Files by Nginx from Django using Docker
Last updated: Nov 08, 2024