I tried using Amazon SQS with django-celery

I use django-celery + RabbitMQ at my current job, but since the infrastructure runs on AWS, I wondered whether it could be hooked up to Amazon SQS, so I gave it a try.

The source code can be found on GitHub.

Preparation

Install the required libraries with pip.

$ pip install django
$ pip install django-celery
$ pip install boto
$ pip install django-dotenv
$ pip freeze > requirements.txt

Create a sample project

$ django-admin.py startproject demo .
$ python manage.py startapp items

Edit settings.py

Since we will be using Amazon SQS, prepare an access key and secret key before editing settings.py. You could write the access key and so on directly in settings.py, but I didn't want them managed in git, so I used django-dotenv.

The .env file looks like this:

.env


AWS_SQS_ACCESS_KEY=XXXXX
AWS_SQS_SECRET_ACCESS_KEY=YYYYY
AWS_SQS_REGION=ap-northeast-1
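For reference, django-dotenv essentially just reads KEY=VALUE lines from this file into os.environ. A minimal sketch of the idea (my own illustration, not the library's actual implementation):

```python
import os

def load_env(text):
    """Load simple KEY=VALUE lines into os.environ (blanks and comments skipped)."""
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        key, _, value = line.partition('=')
        os.environ.setdefault(key.strip(), value.strip())

load_env("""
AWS_SQS_ACCESS_KEY=XXXXX
AWS_SQS_SECRET_ACCESS_KEY=YYYYY
AWS_SQS_REGION=ap-northeast-1
""")

print(os.environ['AWS_SQS_REGION'])  # ap-northeast-1
```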

Then edit settings.py.

demo/settings.py


#import added
import os
import urllib

〜

#DB is sqlite3
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'demo.db',
    }
}

〜

#Change the timezone for the time being
TIME_ZONE = 'Asia/Tokyo'

〜

INSTALLED_APPS = (
    〜
    #Add the following
    'djcelery',
    'items',
)

〜

#Add the following at the end
# I use urllib.quote because the access key etc. may contain characters that require URL encoding.
import djcelery
djcelery.setup_loader()

BROKER_URL = 'sqs://%s:%s@' % (urllib.quote(os.environ['AWS_SQS_ACCESS_KEY'], ''),
                               urllib.quote(os.environ['AWS_SQS_SECRET_ACCESS_KEY'], ''))
BROKER_TRANSPORT_OPTIONS = {'region': os.environ['AWS_SQS_REGION'],
                            'queue_name_prefix': 'celery_sqs_demo-'}
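The second argument to urllib.quote matters here: by default, quote treats '/' as safe and leaves it unescaped, but AWS secret keys can contain '/', which would corrupt the broker URL. Passing '' as the safe argument forces it to be encoded too. A quick check with a made-up key (the try/except just lets the snippet also run on Python 3):

```python
try:
    from urllib import quote          # Python 2, as used in this article
except ImportError:
    from urllib.parse import quote    # Python 3 equivalent

secret = 'abc/def+ghi'  # hypothetical secret key with URL-unsafe characters

print(quote(secret))      # abc/def%2Bghi  ('/' left as-is by default)
print(quote(secret, ''))  # abc%2Fdef%2Bghi ('/' encoded, safe for the broker URL)
```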

Then add the following to manage.py:

manage.py


import dotenv

dotenv.read_dotenv() 

The .env file is now loaded.

Create application

model

Nothing fancy here.

items/models.py


from django.db import models

class Item(models.Model):
    uuid = models.CharField(max_length=200)
    created_at = models.DateTimeField('created at')

task

Create a task to be processed asynchronously. Note that this file apparently needs to be named **tasks.py**.

items/tasks.py


import time
from celery import task
from items.models import Item

@task
def add_item(uuid, created_at):
    time.sleep(3)
    item = Item(uuid=uuid, created_at=created_at)
    item.save()

View

This is also nothing fancy.

items/views.py


import uuid
from django.http import HttpResponse
from django.utils import timezone
from items.tasks import add_item

def index(request):
    add_item.delay(uuid.uuid4(), timezone.now())
    return HttpResponse('success')
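Incidentally, uuid.uuid4() returns a UUID object; when it is saved into the CharField it ends up as the familiar 36-character string (32 hex digits plus 4 hyphens), comfortably within max_length=200:

```python
import uuid

u = uuid.uuid4()
s = str(u)  # e.g. 'c23bd4f4-720f-4488-a6b9-dc26ed495c71'

print(len(s))        # 36
print(s.count('-'))  # 4
```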

Also edit urls.py.

demo/urls.py


from django.conf.urls import patterns, include, url

# Uncomment the next two lines to enable the admin:
# from django.contrib import admin
# admin.autodiscover()

urlpatterns = patterns('',
    url(r'^items/$', 'items.views.index'),
)

Run

Before running, prepare the DB. The celery-related tables are created as well.

$ python manage.py syncdb

Start celeryd. Add -l info to see the logs.

$ python manage.py celeryd -l info

At first, celeryd failed to start with the following error.

ValueError: invalid literal for int() with base 10: 'XXXXXXXXXX'


As noted in the settings.py section above, this happens because AWS access keys and secret keys sometimes contain characters that require URL encoding.

 Once celeryd starts successfully, start the Django application.

$ python manage.py runserver


Once it starts, try accessing http://localhost:8000/items/.
If you then check the celeryd log, you will see entries like the following, confirming that the messages are being received.

[2014-01-25 15:38:27,668: INFO/MainProcess] Received task: items.tasks.add_item[XXXXXXXXXX]
[2014-01-25 15:38:30,702: INFO/MainProcess] Task items.tasks.add_item[XXXXXXXXXX] succeeded in 3.031301742s: None


If you check the DB as well, the record has been created properly.

$ sqlite3 demo.db

sqlite> select * from items_item;
1|c23bd4f4-720f-4488-a6b9-dc26ed495c71|2014-01-25 06:38:26.908489


When I checked SQS, the following two queues had been created.

* celery_sqs_demo-celery
* celery_sqs_demo-celery_{hostname}-celery-pidbox


The celery_sqs_demo part is the queue_name_prefix set in BROKER_TRANSPORT_OPTIONS in settings.py.

I'm not sure what the celery_{hostname}-celery-pidbox queue is for, so I'll look it up.
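If you want to check the queues from code instead of the AWS console, something like this should work with boto (an untested sketch, assuming the same credentials from .env are exported as environment variables):

```python
import os

import boto.sqs

# Connect using the same credentials the broker uses
conn = boto.sqs.connect_to_region(
    os.environ['AWS_SQS_REGION'],
    aws_access_key_id=os.environ['AWS_SQS_ACCESS_KEY'],
    aws_secret_access_key=os.environ['AWS_SQS_SECRET_ACCESS_KEY'],
)

# List only the queues created by this demo
for queue in conn.get_all_queues(prefix='celery_sqs_demo-'):
    print(queue.name)
```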

# Summary

I expected it to take more effort, but I was able to confirm everything works without hitting any major pitfalls.
However, since SQS support in Celery is still marked Experimental, I'm looking forward to it becoming Stable.
Next time, I'd like to send messages from multiple hosts and check whether any tasks get duplicated.

# Reference

* [First steps with Django - Celery 3.1.8 documentation](http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html#using-celery-with-django)
* [Using Amazon SQS - Celery 3.1.8 documentation](http://docs.celeryproject.org/en/latest/getting-started/brokers/sqs.html)
* [Asynchronous processing quick start guide with django-celery - hirokiky's blog](http://blog.hirokiky.org/2013/03/23/quick_start_guid_about_django_celery.html)
