I'm trying to create more separation of concerns with endpoints in my Django apps, but I'm not sure how to implement a service layer. I've read through a popular Python article on this topic, but it doesn't seem to address the testing issue.
For instance, if a request comes in to save a user and I want to send an email after the user has been saved, a popular way to handle this logic is by overriding save like this:
**models.py**
    from django.contrib.auth.models import AbstractBaseUser, PermissionsMixin

    from services import activate_user

    class User(AbstractBaseUser, PermissionsMixin):
        ...
        def save(self, *args, **kwargs):
            # Runs only on the initial insert, not on later updates
            if self._state.adding:
                activate_user(user=self)
            super().save(*args, **kwargs)
**services.py**
    from notifications import iNotificationService  # hypothetical module holding the interface

    ...

    def activate_user(user):
        user.active = True
        iNotificationService().send_message(user)
In the above example, iNotificationService would be an interface whose implementation the application chooses at runtime. If the user were saved in a production environment, the application would provide a class like the following
    import mandrill

    class EmailClient(iNotificationService):
        def send_message(self, user):
            # create_message and mandrill.send_email stand in for the
            # real Mandrill client calls
            message = create_message(user)
            mandrill.send_email(message)
into the services.py module so that an email is actually sent. But if the application were run in a testing environment, the test would mock out the EmailClient by supplying a bare instance of the interface, so that no email is sent (a sketch of such a test follows the stub):
    class iNotificationService(object):
        def send_message(self, user):
            pass
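For what it's worth, here's roughly how I'd expect such a test to look (a minimal sketch, assuming activate_user resolves iNotificationService as a module-level name in services.py):

    # test_services.py: patches the name that services.py looks up,
    # so no real notification is ever sent
    from types import SimpleNamespace
    from unittest import mock

    import services

    def test_activate_user_sends_no_real_email():
        # Bare stand-in for a User; only the .active attribute matters here
        user = SimpleNamespace(active=False)
        with mock.patch.object(services, "iNotificationService") as fake_service:
            services.activate_user(user)
        assert user.active is True
        fake_service.return_value.send_message.assert_called_once_with(user)

Patching a module-level name like this works, but it feels like working around the design rather than actually injecting a dependency.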
What I'm wondering is how I supply that instance to the services.py module so that activate_user knows which type of notification to send. My only idea so far has been to pass some argument to the save method so that it knows which notification implementation to use; I've sketched that idea below. But I'm wondering how scalable that solution would be, considering the different places one might use a service.
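Roughly what I mean is something like this (a minimal sketch with hypothetical signatures; activate_user is reworked to take the service as a parameter instead of hard-coding it):

    # services.py: activate_user now receives the service instead of
    # instantiating it itself
    def activate_user(user, notification_service):
        user.active = True
        notification_service.send_message(user)

    # models.py: callers can inject a service; production falls back to
    # EmailClient (wherever that class ends up living)
    from django.contrib.auth.models import AbstractBaseUser, PermissionsMixin

    from services import activate_user, EmailClient

    class User(AbstractBaseUser, PermissionsMixin):
        ...
        def save(self, *args, notification_service=None, **kwargs):
            if self._state.adding:
                activate_user(user=self,
                              notification_service=notification_service or EmailClient())
            super().save(*args, **kwargs)

That would let a test call user.save(notification_service=iNotificationService()), but it also means every call site that can create a user has to know about notification services, which is exactly the scaling concern.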