Technology Blog

Simple Job Queues with django_rq

The de facto solution for job queues with background workers is Celery with RabbitMQ, but that stack is not the right fit for every project. RQ is an alternative to Celery: while not as feature-rich, it is a lightweight solution that is easy to set up and use. RQ is written in Python and uses Redis as the backend for establishing and maintaining its job queues. There is a great package, Django-RQ, that integrates RQ into your Django project.

Installing django-rq is simple!

pip install django-rq

Configuring django-rq takes only a few steps.

Add "django_rq" to your installed apps:

INSTALLED_APPS = (
   ...
   "django_rq",
)

Configure one or more RQ connections
You can easily configure more than one named queue.

RQ_QUEUES = {
    'default': {
        'HOST': 'localhost',
        'PORT': 6379,
        'DB': 8,
    },
}
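For instance, you can define a second named queue alongside 'default'. In this sketch the 'high' queue name and the DEFAULT_TIMEOUT value are illustrative choices, not requirements:

```python
# settings.py -- a sketch of two named queues; the 'high' name and the
# DEFAULT_TIMEOUT value are illustrative choices, not requirements.
RQ_QUEUES = {
    'default': {
        'HOST': 'localhost',
        'PORT': 6379,
        'DB': 8,
    },
    'high': {
        'HOST': 'localhost',
        'PORT': 6379,
        'DB': 8,
        'DEFAULT_TIMEOUT': 500,  # seconds a job may run before it is killed
    },
}
```

Jobs enqueued to different named queues can then be worked by different workers, which is handy for separating urgent work from bulk work.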

Django-RQ provides a new management command, rqworker, that is used to create background workers to churn through the jobs in your queue. Starting a worker is as simple as "python manage.py rqworker default", where default is the name of the queue you would like to use. You can also pass more than one queue name to rqworker, and the worker will listen on all of them.
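For example, a single worker can listen on several queues at once; the queue names below are illustrative:

python manage.py rqworker high default

The worker checks the queues in the order given, so jobs on high are picked up before jobs on default.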

Any standalone function can be enqueued in RQ, and enqueuing jobs is easy! First, let's create a tasks.py file (the name is arbitrary) and define a function.

# tasks.py
def add(x, y):
    print("The output is:", x + y)

The django_rq package provides an easy interface to get named queues and add jobs to them. First, we create a queue object; get_queue takes the name of the queue we wish to use.

>>> import django_rq
>>> queue = django_rq.get_queue('default')

Now that we have created our queue object, we can enqueue a job.

>>> import tasks
>>> queue.enqueue(tasks.add, 1, 2)
<rq.job.Job object at 0x302ef10>

enqueue() returns a job object that provides a variety of information about the job’s status, parameters, etc. enqueue() takes the function to be enqueued as its first parameter, followed by that function’s positional arguments. You can also pass keyword arguments, so I could have enqueued the job using queue.enqueue(tasks.add, x=1, y=2). If you need to pass some additional options to the job itself, use enqueue_call() instead (see the RQ documentation for more information).
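Continuing the session above, a sketch of enqueue_call(); the timeout value is an illustrative choice, and a running Redis server is assumed:

>>> job = queue.enqueue_call(func=tasks.add, args=(1, 2), timeout=60)

enqueue_call() separates the function and its arguments from job options such as timeout, the number of seconds the job may run before the worker kills it.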

Django-RQ also provides a set of views and URLs that show information about completed and failed jobs. Just add the django_rq URLs to your site’s urls.py. These views require Django’s admin interface, so make sure it is enabled in your installed apps and URL configuration.
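A sketch of wiring those URLs in, assuming a modern Django with path(); the 'django-rq/' prefix is a conventional choice and can be anything you like:

```python
# urls.py -- a sketch; the 'django-rq/' prefix is an illustrative choice.
from django.urls import include, path

urlpatterns = [
    # ... your other routes ...
    path('django-rq/', include('django_rq.urls')),
]
```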

For more information about Django-RQ, check out its GitHub page, which has some great information about configuring logging and testing. Also, if you are interested in logging to Sentry, there is a discussion thread on the project’s GitHub. Finally, there is another great companion project, rq-scheduler, that allows jobs to be scheduled to run at specific times in the future.
 

