4

I'm trying to create a central logging system for some of my Python modules. I want a number of modules to be able to send log messages, and a central logger that receives and processes them.


For simplicity, I want my module A to look something like this:

  bus = connect_to_a_bus_that_is_always_there
  while True:
      # Publish a message to the message bus, pseudocode
      bus.publish(topic="logs.a", message="example")
      time.sleep(1)

and the logger (the only subscriber):

def actOnNewMessage(msg):
    if msg.topic.subtopic == "a":
        doSomethingForA(msg.data)

bus = connect_to_a_bus_that_is_always_there
bus.subscribe("logs", handler=actOnNewMessage)

while True:
    pass  # wait for messages
Right now the Logger module acts like a library, so it doesn't run persistently. Maybe I can introduce something between the Logger and the message bus that constantly watches for new messages.

I have looked at PyPubSub, but its documentation doesn't seem to cover persistent communication between separately running Python modules. If anyone has tried that, it would work for me as long as it can be used between different modules.

Another catch is that I might end up with modules not written in Python, so I don't really want direct communication between modules A, B and the Logger. In the end my architecture might look like this:

I hope the information above is not confusing.

tl;dr: publish-subscribe with a persistent message bus in Python, and a subscriber that is constantly waiting for new messages. Is there any ready-to-use solution?

EDIT: I'm considering running a WebSocket server that knows about the Logger module, while modules A and B know the WebSocket server's address. Are there any drawbacks to this design?

akalikin
  • I don't totally understand - are A and B separate processes? I'm assuming the logger is a separate process? In terms of the transport mechanism - would you use any sort of 3rd party product (eg Redis)? – Aidan Kane Aug 06 '15 at 12:59
  • Sorry, just saw that you said the Logger is a library - so, not a separate process. – Aidan Kane Aug 06 '15 at 13:01
  • @AidanKane yes, A and B are separate processes doing different jobs, and unfortunately I can't use any 3rd party products – akalikin Aug 06 '15 at 13:04
  • Ok. I posted a redis one as an answer but I guess that's out of the question. Do you have any more info about the characteristics you need. Is there a lot of logging going on? Can you write log files to disk and tail those? – Aidan Kane Aug 06 '15 at 13:10
  • I think the web socket approach would be fine- that's where I would probably look next if I wasn't allowed to use redis (though I'm not an expert in this area). – Aidan Kane Aug 06 '15 at 13:29
  • Take the following with a pinch of salt but depending on the reason for the persistent connection part of the requirement you could use UDP. The tradeoff would be that messages might go missing. See here for a good basic example http://www.binarytides.com/programming-udp-sockets-in-python/ – Aidan Kane Aug 06 '15 at 13:30
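The UDP idea from the comments can be done with nothing but the standard library. A minimal sketch, assuming the Logger binds a UDP socket and each module fires datagrams at it (the address, port choice, and message format here are illustrative, and messages can be silently lost, as the comment notes):

```python
import socket

# Logger process: bind a UDP socket and wait for datagrams
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))        # let the OS pick a free port
recv_sock.settimeout(5)
log_addr = recv_sock.getsockname()

# Module A: fire-and-forget; no connection, so the Logger can come and go
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(b"logs.a example", log_addr)

# Back in the Logger: receive and dispatch on the topic prefix
msg, _ = recv_sock.recvfrom(65536)
print(msg)   # b'logs.a example'
```

Because UDP is connectionless, senders never block on a dead Logger; the trade-off is exactly the one mentioned above: no delivery guarantee.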

3 Answers

1

OpenSplice is a message bus that allows persistent, buffered data communication. Do not roll your own message bus! They are complicated beasts.

Why not simply use syslog? There are versions of syslog that also support logging from multiple nodes to a central collection point. Many programming languages have support for it, including python.

I would strongly recommend using the standard Python logging framework. It allows you to choose where logs go using various standard handlers, such as SysLogHandler, SocketHandler and DatagramHandler.

It even allows you to write your own handler, if you must...
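As a minimal sketch of that approach, using DatagramHandler with a plain UDP socket standing in for the central Logger process (the logger name and message are just examples; a real receiver would loop and dispatch):

```python
import logging
import logging.handlers
import pickle
import socket

# Stand-in for the central Logger process: a UDP socket it would listen on
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))          # let the OS pick a free port
recv_sock.settimeout(5)
host, port = recv_sock.getsockname()

# Module A: just attach a DatagramHandler -- no broker required
logger = logging.getLogger("logs.a")
logger.setLevel(logging.INFO)
logger.addHandler(logging.handlers.DatagramHandler(host, port))

logger.info("example")

# Central Logger side: records arrive pickled, with a 4-byte length prefix
data = recv_sock.recv(65536)
record = logging.makeLogRecord(pickle.loads(data[4:]))
print(record.name, record.getMessage())   # logs.a example
```

The topic hierarchy from the question ("logs.a", "logs.b") maps naturally onto logger names, and swapping DatagramHandler for SocketHandler or SysLogHandler changes only the transport.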

EvertW
1

I've come across nanomsg. It perfectly suits my needs: MIT-licensed, no additional server processes to run, and there are bindings for any language I would like to use.

from nanomsg import Socket, PUB

# Publisher (e.g. module A) -- one side must bind so the other can connect
s = Socket(PUB)
s.bind('tcp://*:8080')
s.send(b'topicMessage')

from nanomsg import Socket, SUB, SUB_SUBSCRIBE

# Subscriber (the Logger) -- receives only messages starting with "topic"
s = Socket(SUB)
s.connect('tcp://localhost:8080')
s.set_string_option(SUB, SUB_SUBSCRIBE, b'topic')
while True:
    print(s.recv())
akalikin
  • 2
    Good suggestion. The only problem: No security. Any process can open a connection and send or receive data. I just checked it. – Regis May Feb 13 '19 at 11:44
0

You could use Redis as a broker and run logger.py in a separate process.

logger.py

import redis

r = redis.Redis()

while True:
    # blpop blocks until an item is available and returns a (key, value) pair
    _, next_log_item = r.blpop(['logs'], 0)
    write_to_db(next_log_item)  # placeholder for whatever processing you need

a.py

import redis
import time

r = redis.Redis()

while True:
    message = 'example'  # whatever this module wants to log
    r.rpush('logs', message)
    time.sleep(1)
Aidan Kane
  • Thanks for the suggestion, this solution would be ideal if there were a Redis-like server that I could simply initiate in a Python module somewhere – akalikin Aug 06 '15 at 13:17
  • So that's basically what I'm looking for, but just slightly not the right format. For web sockets I can use Bottle, and something like this would be really good. – akalikin Aug 06 '15 at 13:19