14

Do you know of or use any distributed job queue for Python? Can you share links or tools?

9 Answers

12

Pyres is a Resque clone built in Python. Resque is used by GitHub as their message queue. Both use Redis as the queue backend and provide a web-based monitoring application.

http://binarydud.github.com/pyres/intro.html
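A minimal sketch of the pattern from the pyres intro, assuming a Redis server on localhost; the Downloader job class and URL are made up for illustration:

    from pyres import ResQ

    class Downloader(object):
        # Workers watch the queue named here.
        queue = "downloads"

        @staticmethod
        def perform(url):
            # Runs inside the worker process, not the enqueuing one.
            print("downloading %s" % url)

    # ResQ() assumes Redis on localhost:6379.
    r = ResQ()
    r.enqueue(Downloader, "http://example.com/file.tgz")

A worker started with pyres_worker downloads should then pick the job up.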

optixx
  • 2,110
  • 3
  • 16
  • 16
4

In addition to multiprocessing there's also the Celery project, if you're using Django.
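A minimal sketch using Celery's later standalone API (no Django required, as the comments below note); the Redis broker URL is an assumption, and any supported broker works:

    from celery import Celery

    # Broker choice is an assumption; Redis is just one option.
    app = Celery("tasks", broker="redis://localhost:6379/0")

    @app.task
    def add(x, y):
        return x + y

Run a worker with celery -A tasks worker, then call add.delay(2, 3) from any process pointed at the same broker.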

Vinay Sajip
  • 95,872
  • 14
  • 179
  • 191
  • Thanks for the link! Is it strictly used with Django? Can we use it for standard Python projects? –  Aug 28 '09 at 03:20
  • I don't see why not, with suitable adaptation (not sure how much work that'll be - depends on your exact requirement). – Vinay Sajip Aug 28 '09 at 08:20
  • Celery has an underlying library, called Carrot, that you can use without Django. – Alex Gaynor Jan 08 '10 at 03:56
  • 1
    Both Celery and Carrot work without Django; that is, you can use them from outside of a Django project. Recently someone even implemented Paste support: http://bitbucket.org/twillis/celery-paste/ – asksol Feb 01 '10 at 16:25
  • 1
    Celery is now *designed* to be used outside of Django (but still has Django support if you need it) – Matthew Wilcoxson Oct 28 '12 at 17:36
3

There's also "bucker" by Sylvain Hellegouarch which you can find here:

It describes itself like this:

  • bucker is a queue system that supports multiple storage backends for the queue (memcached and Amazon SQS for now) and is driven by XML messages sent over a TCP connection between a client and the queue server.
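The XML wire format isn't shown here, so the sketch below is only a rough illustration of the architecture described (a client pushing an XML message to the queue server over TCP); the schema, queue name, and port are all assumptions:

    import socket

    # Entirely hypothetical message shape and port.
    message = b'<?xml version="1.0"?><push queue="jobs"><payload>hello</payload></push>'

    sock = socket.create_connection(("localhost", 9090))
    try:
        sock.sendall(message)
        print(sock.recv(4096))
    finally:
        sock.close()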
Michael Sparks
  • 634
  • 5
  • 9
2

Look at beanstalkd.
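A minimal producer/consumer sketch against a local beanstalkd, using the third-party beanstalkc client (the client choice, default port, and payload are assumptions):

    import beanstalkc  # pip install beanstalkc

    # Assumes beanstalkd listening on its default port.
    conn = beanstalkc.Connection(host="localhost", port=11300)

    conn.put("resize-image:42")   # producer side

    job = conn.reserve()          # consumer side: blocks until a job arrives
    print(job.body)               # -> "resize-image:42"
    job.delete()                  # acknowledge completion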

nos
  • 223,662
  • 58
  • 417
  • 506
2

redqueue? It's implemented in Python on the Tornado framework, speaks the memcached protocol, and can optionally persist to log files. Currently it can also behave like beanstalkd, supporting the reserve/delete style within the memcached protocol.

REDQUEUE
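Since it speaks the memcached protocol, a stock memcached client should work as the queue client; a rough sketch (host, port, and queue name are assumptions, and the set/get-as-push/pop mapping is inferred from the description above):

    import memcache  # pip install python-memcached

    mc = memcache.Client(["127.0.0.1:11211"])

    mc.set("jobs", "task payload")  # enqueue onto the "jobs" queue
    item = mc.get("jobs")           # dequeue the next item
    print(item)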

superisaac
  • 21
  • 1
  • At present I have been using Celery. I will look at redqueue. Thanks for the answer –  Jan 08 '10 at 15:03
2

If you think that Celery is too heavy for your needs, then you might want to look at the simple distributed task queue:

versale
  • 61
  • 5
1

It's a year late, but this is something I've hacked together to make a queue of processes, executing only X of them at a time: http://github.com/goosemo/job_queue
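This isn't the linked project's API, but the same "only X at a time" behavior can be sketched with multiprocessing.Pool; the task function and counts are placeholders:

    from multiprocessing import Pool

    def run_task(n):
        # Stand-in for real work; runs in one of the X worker processes.
        return n * n

    if __name__ == "__main__":
        X = 4  # at most X tasks execute at once; the rest wait queued
        pool = Pool(processes=X)
        print(pool.map(run_task, range(20)))
        pool.close()
        pool.join()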

Morgan
  • 4,143
  • 27
  • 35
-1

You probably want to look at multiprocessing's Queue. It's included in Python 2.6; for earlier versions of Python, get it from PyPI.

Standard library documentation: http://docs.python.org/library/multiprocessing.html
On PyPI: http://pypi.python.org/pypi/multiprocessing
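A minimal producer/worker sketch with multiprocessing.Queue (payloads and worker count are arbitrary). Note that this shares work between local processes only, not across machines:

    from multiprocessing import Process, Queue

    def worker(q):
        # Pull items until the producer sends the None sentinel.
        while True:
            item = q.get()
            if item is None:
                break
            print("processing", item)

    if __name__ == "__main__":
        q = Queue()
        workers = [Process(target=worker, args=(q,)) for _ in range(2)]
        for p in workers:
            p.start()
        for item in ["job-a", "job-b", "job-c"]:
            q.put(item)
        for _ in workers:
            q.put(None)  # one sentinel per worker
        for p in workers:
            p.join()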

djc
  • 11,603
  • 5
  • 41
  • 54
-3

There is also the Unix 'at' command.

For more info: man at
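For example, a one-off job can be scheduled from Python by piping a command to at; the script path and delay are placeholders:

    import subprocess

    # at(1) reads the command to run from stdin (requires atd to be running).
    subprocess.run(
        ["at", "now + 5 minutes"],
        input=b"python /path/to/job.py\n",
        check=True,
    )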

mac2017
  • 445
  • 1
  • 6
  • 16