Distributed Celery Workers

I’ve been working a lot with Django and Celery recently (both awesome open source projects, by the way). Celery is a simple, flexible, and reliable distributed system for processing vast amounts of messages.

For my project I have multiple Celery workers that reside on different machines, a so-called distributed setup. The tasks Celery executes are not deployed on the machine that schedules them, so the way tasks are called needs some tuning.

A typical Celery example from the docs is to import the task functions or classes and call .delay() on them:

from proj.celery import app
from proj.tasks import debug

debug.delay()


In my case it’s not practical to have the task code residing both on every application that queues tasks and on the Celery worker server. The same problem appears when another distributed Celery worker wants to execute such a task.

So how can I have, for example, a simple REST API that triggers a task on the Celery backend?

Signatures to the rescue. Celery has a Signature type for tasks. Basically, it takes a task name string plus the arguments for the task and creates a task object out of it that can be sent over the wire. Read about it here: Celery Signatures

You can create a signature for the example add task using its name:

from celery import signature
s = signature('proj.tasks.add', args=(2, 2), countdown=10)