
I want to use Amazon SQS as the broker backend for Celery. There is an SQS transport implementation in Kombu, which Celery depends on, but there isn't enough documentation for using it, so I can't figure out how to configure SQS for Celery. Has anybody succeeded in configuring SQS with Celery?

– minhee

7 Answers


I ran into this question several times but still wasn't entirely sure how to set up Celery to work with SQS. It turns out that it is quite easy with the latest versions of Kombu and Celery. As an alternative to the BROKER_URL syntax mentioned in another answer, you can simply set the transport, options, user, and password like so:

BROKER_TRANSPORT = 'sqs'
BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',
}
BROKER_USER = AWS_ACCESS_KEY_ID
BROKER_PASSWORD = AWS_SECRET_ACCESS_KEY

This gets around a purported issue with the URL parser that doesn't allow forward slashes in your API secret, which seems to be a fairly common occurrence with AWS. Since there didn't seem to be a wealth of information out there about the topic yet, I also wrote a short blog post on it here:

http://www.caktusgroup.com/blog/2011/12/19/using-django-and-celery-amazon-sqs/
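
As a point of reference, here is a minimal sketch of how these settings might sit together in a single settings module. The environment-variable lookups and the queue_name_prefix option are assumptions for illustration, not part of the original answer:

# Minimal sketch, assuming Celery 3.x-era uppercase setting names.
import os

AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']

BROKER_TRANSPORT = 'sqs'
BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',
    # Assumption: an optional prefix so this project's queues are easy to spot in the SQS console.
    'queue_name_prefix': 'myproject-',
}
BROKER_USER = AWS_ACCESS_KEY_ID
BROKER_PASSWORD = AWS_SECRET_ACCESS_KEY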

– tobias.mcnulty
  • +1 thank you for your work on this @tobias. Good blog post and the discussion in the comments is very informative. Keep us updated! – JCotton Feb 06 '12 at 21:09
    +1 thank you for your work on this @tobias. Good blog post and the discussion in the comments is very informative. Keep us updated! – JCotton Feb 06 '12 at 21:09
  • Has amazon SQS gotten any faster? I haven't used it yet mainly because I hear a lot of reports of terrible latency ( > 2 minutes), before a task shows up in a queue, for example. – Andres May 10 '13 at 19:51
  • This doesn't work for me. There is an error in Boto's authentication system - No handler was ready to authenticate. 1 handlers were checked. ['HmacAuthV4Handler'] Check your credentials – iankit Nov 16 '15 at 07:11
  • What about if we have multiple SQS queues? How can we tell python which queue to use? – alexislg Jul 03 '17 at 07:49

I'm using Celery 3.0 and was getting deprecation warnings when launching the worker with the BROKER_USER / BROKER_PASSWORD settings.

I took a look at the SQS URL parsing in kombu.utils.url._parse_url and it calls urllib.unquote on the username and password elements of the URL.

So, to work around the issue of secret keys with forward slashes, I was able to successfully use the following for the BROKER_URL:

import urllib  # Python 2; on Python 3 use urllib.parse.quote instead
BROKER_URL = 'sqs://%s:%s@' % (urllib.quote(AWS_ACCESS_KEY_ID, safe=''),
                               urllib.quote(AWS_SECRET_ACCESS_KEY, safe=''))

I'm not sure if access keys can ever have forward slashes in them, but it doesn't hurt to quote them as well.
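
As a quick illustration of what the quoting does (the credential values below are made up, and this uses Python 2's urllib to match the answer):

import urllib

# Hypothetical credentials; the secret contains a forward slash.
access_key = 'AKIAEXAMPLE'
secret_key = 'abc/def+ghi'

print(urllib.quote(secret_key, safe=''))
# -> abc%2Fdef%2Bghi  (the "/" no longer confuses the URL parser)

print('sqs://%s:%s@' % (urllib.quote(access_key, safe=''),
                        urllib.quote(secret_key, safe='')))
# -> sqs://AKIAEXAMPLE:abc%2Fdef%2Bghi@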

– CalloRico
  • This worked with a slash in my secret key on celery 3.1 – Pim Apr 15 '15 at 22:30
  • From the celery 4.1.0 docs: `The login credentials can also be set using the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, in that case the broker URL may only be sqs://.` I could not get this to work, but explicitly building the BROKER_URL as this answer does worked beautifully. – bluescores Jan 15 '18 at 17:20
  • https://stackoverflow.com/questions/31827012/python-importing-urllib-quote#31827113 For Python 3 – Dimitrios Mistriotis May 30 '18 at 15:53

Nobody had answered this, so I tried to configure Celery with Amazon SQS myself, and it seems I achieved a small success.

Kombu needs to be patched for this, so I wrote some patches, and there is my pull request as well. With the patched Kombu, you can configure Amazon SQS by setting a BROKER_URL with the sqs:// scheme in Celery. For example:

BROKER_URL = 'sqs://AWS_ACCESS:AWS_SECRET@:80//'
BROKER_TRANSPORT_OPTIONS = {
    'region': 'ap-northeast-1',
    'sdb_persistence': False
}
– minhee
  • Awesome! Tried out your patched branch, and SQS finally worked, but the worker kept getting the same task over and over (with the same task id, even!). So I went back to using RabbitMQ as the broker. Are you using SQS+celery in production? Facing any such issues? – sajal Dec 04 '11 at 20:23
  • @sajal I faced the exact same issue, so I finally went back to using RabbitMQ. – minhee Dec 05 '11 at 09:44
  • @sajal: When creating your SQS queue you need to set the Default Visibility timeout to some time that's greater than the max time you expect a task to run. This is the time SQS will make a message invisible to all other consumers after delivering to one consumer. I believe the default is 30 seconds. So, if a task takes more than 30 seconds, SQS will deliver the same message to another consumer because it assumes the first consumer died and did not complete the task. – Gustavo Ambrozio Jun 04 '12 at 16:56
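
Following up on that comment, the visibility timeout can also be set from the Celery side via the broker transport options; here is a sketch, assuming a reasonably recent Celery SQS transport (the 3600-second value is only an example):

BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',
    # Assumption: give long-running tasks up to an hour before SQS re-delivers the message.
    'visibility_timeout': 3600,
}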

For anybody stumbling upon this question, I was able to get Celery working out-of-the-box with SQS (no patching required), but I did need to update to the latest versions of Celery and Kombu for this to work (1.4.5 and 1.5.1 as of now). Use the config lines above and it should work (although you'll probably want to change the default region).

Gotcha: in order to use the URL format above, you need to make sure your AWS secret doesn't contain slashes, as this confuses the URL parser. Just keep generating new secrets until you get one without a slash.
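
If you'd rather fail fast than debug a mangled URL, a trivial startup check along these lines (a sketch, not from the original answer) can catch the problem early:

# Sketch: refuse to start if the secret would break the unquoted sqs:// URL form.
if '/' in AWS_SECRET_ACCESS_KEY:
    raise RuntimeError('AWS secret contains "/"; regenerate it or URL-quote it in BROKER_URL.')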

– nitwit
  • An answer should stand on its own as a self-contained response. It should contain all the information required to be understood. Referencing something else, somewhere else ("config lines above") isn't helpful, especially considering the position of answers on a page is dynamic. – JCotton Feb 06 '12 at 20:59

I regenerated the credentials in the IAM console until I got a key without a slash (/). The parsing issues are only with that character, so if your secret doesn't have one you'll be fine.

Not the most terribly elegant solution, but definitely keeps the code clean of hacks.

– WhyNotHugo

Update for Python 3, URL-quoting any slashes in the AWS keys.

from urllib.parse import quote_plus
BROKER_URL = 'sqs://{}:{}@'.format(
    quote_plus(AWS_ACCESS_KEY_ID), 
    quote_plus(AWS_SECRET_ACCESS_KEY)
)
– bones225

I was able to configure SQS on Celery 4.3 (Python 3.7) by using Kombu.

from kombu.utils.url import quote

CELERY_BROKER_URL = 'sqs://{AWS_ACCESS_KEY_ID}:{AWS_SECRET_ACCESS_KEY}@'.format(
    AWS_ACCESS_KEY_ID=quote(AWS_ACCESS_KEY_ID, safe=''),
    AWS_SECRET_ACCESS_KEY=quote(AWS_SECRET_ACCESS_KEY, safe='')
)
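
For completeness, a minimal sketch of wiring this into a Celery 4 app object; the app name, region value, and environment-variable lookups are assumptions:

import os

from celery import Celery
from kombu.utils.url import quote

AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']

app = Celery('myapp')
app.conf.broker_url = 'sqs://{}:{}@'.format(
    quote(AWS_ACCESS_KEY_ID, safe=''),
    quote(AWS_SECRET_ACCESS_KEY, safe=''),
)
# Assumption: the region is selected via transport options; adjust to your AWS region.
app.conf.broker_transport_options = {'region': 'us-east-1'}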