Do you know of or use any distributed job queue for Python? Can you share links or tools?
9 Answers
Pyres is a Resque clone built in Python. Resque is used by GitHub as their message queue. Both use Redis as the queue backend and provide a web-based monitoring application.

In addition to multiprocessing there's also the Celery project, if you're using Django.

- Thanks for the link! Is it strictly for use with Django, or can we use it in standard Python projects? – Aug 28 '09 at 03:20
- I don't see why not, with suitable adaptation (not sure how much work that'll be – depends on your exact requirements). – Vinay Sajip Aug 28 '09 at 08:20
- Celery has an underlying library, called Carrot, that you can use without Django. – Alex Gaynor Jan 08 '10 at 03:56
- Both Celery and Carrot work without Django; that is, you can use them from outside of a Django project. Recently someone even implemented Paste support: http://bitbucket.org/twillis/celery-paste/ – asksol Feb 01 '10 at 16:25
- Celery is now *designed* to be used outside of Django (but still has Django support if you need it). – Matthew Wilcoxson Oct 28 '12 at 17:36
There's also "bucker" by Sylvain Hellegouarch which you can find here:
It describes itself like this:
- bucker is a queue system that supports multiple storage backends for the queue (memcached and Amazon SQS for now) and is driven by XML messages sent over a TCP connection between a client and the queue server.

Have a look at redqueue. It's implemented in Python on the Tornado framework, speaks the memcached protocol, and can optionally persist to log files. It can also behave like beanstalkd, supporting the reserve/delete pattern within the memcached protocol.

- At present I am using Celery. I will look at redqueue. Thanks for the answer. – Jan 08 '10 at 15:03
If you think that Celery is too heavyweight for your needs, then you might want to look at this simple distributed task queue:

It's a year late or whatever, but this is something I've hacked together to make a queue of Processes, running only X of them at a time: http://github.com/goosemo/job_queue
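For comparison, the standard library alone can cap concurrency at X processes. A minimal sketch with `multiprocessing.Pool` (the `with` form requires Python 3.3+; `work` is a stand-in job function):

```python
# sketch: run jobs at most X at a time with a stdlib process pool
from multiprocessing import Pool

def work(n):
    # stand-in for a real job
    return n * n

if __name__ == "__main__":
    with Pool(processes=3) as pool:      # X = 3 concurrent worker processes
        print(pool.map(work, range(6)))  # prints [0, 1, 4, 9, 16, 25]
```

`Pool.map` blocks until every job finishes and preserves input order, which keeps the example simple; `imap_unordered` yields results as workers complete them.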

You probably want to look at multiprocessing's Queue. It's included in the standard library since Python 2.6; for earlier versions of Python there's a backport on PyPI.
Standard library documentation: http://docs.python.org/library/multiprocessing.html On PyPI: http://pypi.python.org/pypi/multiprocessing
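A minimal sketch of the producer/worker pattern with `multiprocessing.Queue` (the names and the squaring job are illustrative):

```python
# sketch: a local job queue built on multiprocessing.Queue
from multiprocessing import Process, Queue

def worker(jobs, results):
    # consume jobs until the sentinel None arrives
    for job in iter(jobs.get, None):
        results.put(job * job)

if __name__ == "__main__":
    jobs, results = Queue(), Queue()
    procs = [Process(target=worker, args=(jobs, results)) for _ in range(2)]
    for p in procs:
        p.start()
    for n in range(5):
        jobs.put(n)
    for _ in procs:
        jobs.put(None)            # one sentinel per worker
    for p in procs:
        p.join()
    print(sorted(results.get() for _ in range(5)))  # prints [0, 1, 4, 9, 16]
```

Note this queue only distributes work across processes on one machine; for distribution across hosts you'd need one of the broker-backed options above (Celery, Pyres, etc.) or `multiprocessing.managers`.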
