Documenting this here because Celery's documentation isn't the best in general, but also because I hadn't seen a write-up for this scenario, which I imagine is not an uncommon one.
So here’s the summarised scenario:
The challenge was to run two or more Django sites on one VM without confusing the Celery backend. This requires creating a dedicated queue, exchange, and routing key per site. The Celery documentation gives major clues but doesn't cover Django in much depth. It does cover some first steps with Django, but nothing about deploying custom queues in terms of Django Python files and what goes where, or at least I couldn't find it.
I’m sure there are many ways of achieving the same result but this is what worked for me…
The application won't be able to find Celery if you don't initialise it. The project here is 'netdelta'. Under the <app> dir ('nd' in this case) create a file, say celery_app.py:
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery
from kombu import Exchange, Queue

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'netdelta.settings')

app = Celery('nd')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# No namespace is passed, so Celery reads the old-style CELERY_*
# setting names (CELERY_QUEUES, CELERY_ROUTES, ...) directly from
# the Django settings module.
app.config_from_object('django.conf:settings')

# Load task modules from all registered Django app configs.
#app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

app.conf.task_queues = (
    Queue('cooler', Exchange('cooler'), routing_key='cooler'),
)
app.conf.task_default_queue = 'cooler'
app.conf.task_default_exchange_type = 'direct'
app.conf.task_default_routing_key = 'cooler'


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
This is called when the project fires up, via __init__.py under <app> ('nd' in this case):
from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery_app import app as celery_thang

__all__ = ('celery_thang',)
<django root>/<proj>/<proj>/settings.py is where the magic happens, and it turns out to be just a few lines. 'scheduled_scan' is the <app>.tasks.<celery_task_name>:
CELERY_QUEUES = {"coller": {"exchange": "cooler", "routing_key": "cooler"}} CELERY_ROUTES = { 'nd.tasks.scheduled_scan': {'queue': "cooler", 'exchange': "cooler", 'routing_key': "cooler"}, }
Now do the same for the other site(s).
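To make that concrete, here is a sketch of what a second site's settings might look like. The queue name 'warmer' and the app name 'otherapp' are purely hypothetical stand-ins for whatever the second site is actually called; its celery_app.py mirrors the first one with these names swapped in.

# settings.py of a hypothetical second site -- names are illustrative only.
CELERY_QUEUES = {"warmer": {"exchange": "warmer", "routing_key": "warmer"}}
CELERY_ROUTES = {
    'otherapp.tasks.scheduled_scan': {'queue': "warmer", 'exchange': "warmer", 'routing_key': "warmer"},
}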
Launch a worker thusly …
python manage.py celery worker -Q cooler -n cooler --loglevel=info -B
- ‘cooler’ is the site name
- ‘nd’ is the app name
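Each site then gets its own worker bound to its own queue. Assuming the hypothetical second site from above, its worker would be launched the same way, just with its own queue and node name:

python manage.py celery worker -Q warmer -n warmer --loglevel=info -B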
The final step: your app.task has to target the custom queue you defined. In my case I could afford to make my chosen queue the default, because I only needed one queue per site, but in a multi-queue scenario you will need to specify the queue explicitly (see the sketch after the INSERT below). I was scheduling jobs using the djcelery admin panel that was created by default. The result is a set of database entries; an INSERT statement shows the table structure (note the queue, exchange, and routing_key columns):
INSERT INTO `djcelery_periodictask` (`id`, `name`, `task`, `args`, `kwargs`, `queue`, `exchange`, `routing_key`, `expires`, `enabled`, `last_run_at`, `total_run_count`, `date_changed`, `description`, `crontab_id`, `interval_id`) VALUES (2, 'Linode-67257', 'nd.tasks.scheduled_scan', '["Linode"]', '{}', 'cooler', 'cooler', 'cooler', NULL, 1, '2017-10-15 23:40:00', 18, '2017-10-15 23:41:29', '', 2, NULL);
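For the multi-queue case mentioned above, here is a minimal, illustrative sketch of what nd/tasks.py might look like and how the queue can be named at call time. The real scheduled_scan body is Netdelta-specific, so the body below is a placeholder; the shared_task decorator and apply_async keyword arguments are standard Celery, not anything lifted from the Netdelta code.

from celery import shared_task

@shared_task
def scheduled_scan(site_name):
    # Placeholder body -- the real scan logic is Netdelta-specific.
    print('scanning {0}'.format(site_name))

# In a multi-queue scenario, name the queue explicitly instead of
# relying on task_default_queue:
scheduled_scan.apply_async(args=['Linode'], queue='cooler', routing_key='cooler')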
My virtualenv is as below (note I'm using the fork of libnmap that doesn't use multiprocessing):
>pip list
- amqp (2.2.2)
- anyjson (0.3.3)
- appdirs (1.4.3)
- billiard (3.5.0.3)
- celery (4.1.0)
- Django (1.11.6)
- django-celery-beat (1.0.1)
- django-celery-results (1.0.1)
- django-dajaxice (0.7)
- html2text (2017.10.4)
- iptools (0.6.1)
- kombu (4.1.0)
- MySQL-python (1.2.5)
- netaddr (0.7.19)
- packaging (16.8)
- pip (9.0.1)
- pyparsing (2.2.0)
- python-libnmap (0.7.0)
- pytz (2017.2)
- setuptools (36.6.0)
- six (1.11.0)
- vine (1.1.4)
- wheel (0.30.0)