19

Imagine a situation in which a user performs an action on a website and 20 admins have to be notified by e-mail. With Django's normal (synchronous) e-mail sending, the user has to wait until all the emails are sent before being able to proceed.

How can I send all the emails in a separate process so the user doesn't have to wait? Is it possible?

the_drow
nemesisdesign
  • Alternative (simple) solution could be to send an email to a Gmail (or other) address, which will then use a rule or something and send it on to all admins – Josh Hunt Oct 02 '11 at 12:05
  • Sending to a Gmail address is not really a good solution at all. It would cause a lot of other problems, such as keeping the Gmail address in sync with the admins as they change. Gmail can also be unpredictably slow, have high latency, or even be down, which would cause a lot of unexpected errors and slowness for the users. – andreaspelme Oct 02 '11 at 12:24

3 Answers

25

Use Celery as a task queue, together with django-celery-email, a Django e-mail backend that dispatches e-mail sending to a Celery task.
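A minimal settings sketch of that setup, assuming a Redis broker (the backend path is the one django-celery-email provides; the broker URL is a placeholder for whatever broker you run):

```python
# settings.py (sketch) -- route all Django e-mail through Celery
# The backend class is provided by the django-celery-email package.
EMAIL_BACKEND = 'djcelery_email.backends.CeleryEmailBackend'

# Placeholder broker URL -- point this at your own Redis/RabbitMQ instance.
BROKER_URL = 'redis://localhost:6379/0'
```

With this in place, existing `django.core.mail.send_mail()` calls are queued as Celery tasks instead of blocking the request.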

andreaspelme
  • can I ask one thing? If I have understood correctly I would not need to change the code of my app to implement celery, except for my settings.py. Correct? – nemesisdesign Oct 02 '11 at 13:01
  • As long as you are using the `django.core.mail` API you will not have to change anything in your code. The alternative email backend takes care of the Celery integration. You can, however, easily write other arbitrary Celery tasks to be executed in the background, outside of the web process, which can be very handy. – andreaspelme Oct 02 '11 at 13:09
  • I got OperationalError: [Errno 111] Connection refused in kombu. – Paul R Feb 01 '17 at 12:37
7

Another option is django-mailer. It queues outgoing mail in a database table, and a cron job then sends it.

https://github.com/pinax/django-mailer
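A sketch of how django-mailer is typically wired up; the backend path and management commands below follow its documented usage, but treat the exact cron schedule as an assumption to adapt:

```python
# settings.py (sketch) -- store outgoing mail in a database table
INSTALLED_APPS = [
    # ... your other apps ...
    'mailer',
]
EMAIL_BACKEND = 'mailer.backend.DbBackend'

# crontab entries to actually deliver the queued mail, e.g.:
#   * * * * *       python manage.py send_mail
#   0,20,40 * * * * python manage.py retry_deferred
```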

Jesse
5

If we are only talking about sending 20 or so emails from time to time, a thread may be a workable solution. For expensive background tasks, use Celery.

Here is a sample using a thread:

# This Python file uses the following encoding: utf-8

# threading
from threading import Thread

from django.core.mail import send_mail


class afegeixThread(Thread):

    def __init__(self, usuari, parameter=None):
        Thread.__init__(self)
        self.usuari = usuari
        self.parameter = parameter

    def run(self):
        errors = []
        try:
            if self.parameter:
                # do the slow work here, e.g. send the e-mails
                send_mail('Subject', 'Message body',
                          'noreply@example.com', self.parameter)
        except Exception as e:
            errors.append(e)


n = afegeixThread('p1', parameter=['admin@example.com'])
n.start()
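For completeness, the same fire-and-forget idea works without subclassing: pass the work function straight to `Thread`. The `notify_admins` stand-in below is hypothetical; in a real view it would call `django.core.mail.send_mail`:

```python
from threading import Thread

sent = []

def notify_admins(message):
    # stand-in for the real e-mail sending work
    sent.append(message)

t = Thread(target=notify_admins, args=('user performed an action',))
t.start()
t.join()  # only for this demo; a web view would return without joining
```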
dani herrera
  • This is possible, but the work is still done in a web server process, which is not ideal for background tasks. If you have the ability you should set up a proper task queue and off-load the web processes as much as possible. – andreaspelme Oct 02 '11 at 13:14
  • I'm not sure exactly what you are asking, but Celery is built for handling async workloads outside of the web processes, and it is really easy and straightforward to get going – see the Celery docs. It is usually good practice to split heavy jobs into smaller jobs if possible. Celery can then be run on multiple machines and consume tasks in parallel, making it very easy to scale and handle heavy jobs! This can for instance be applied to video uploads, image resizing, sending e-mails, generating PDFs, or other similar heavy things! – andreaspelme Oct 02 '11 at 13:49