
I am trying to create a script that will tail the logs from many components and write them into a single file. The script needs to start when I ingest a video asset, which is then processed by many components. To start off, I am trying to tail only one component, but it does not seem to work. Can someone help me with this?

from __future__ import with_statement
from fabric.api import *
import re, ConfigParser, paramiko, xlwt, collections, os
import logging
from logging.config import fileConfig
logger = None

def connect():
    global config, logger, status, file_size
    config = ConfigParser.RawConfigParser()
    config.read('config.cfg')
    for section in sorted(config.sections(), key=str.lower):
        env.user = config.get(section, 'server.user_name')
        env.password = config.get(section, 'server.password')
        host = config.get(section, 'server.ip')
        print "Trying to connect to {} server.....".format(section)
    with settings(hide('warnings', 'running', 'stdout', 'stderr'),warn_only=True, host_string=host):
        try:
            files = run('tail -F /var/log/abc')
            with open("E2E_tmp"+".txt", "w") as fo:
                fo.write(files)
        except Exception as e:
            print e
            print 'Could not connect to {} server\n'.format(section)

if __name__ == "__main__":
    connect()

The config file looks like this:

[Astro]
server.user_name = root
server.password = staines
server.ip = 10.209.17.113

The script only shows:

Trying to connect to Astro server.....

If I use the 'ls' command instead of the tail command, I can see the listing in the text file. Please help me understand how to redirect tail output from many files into a single file, or whether there is a better way to do it.

Note: I am running the script from a Windows PC.

  • `tail` never returns, so you're stuck with a single server. If you want to monitor all files simultaneously, you need to launch a separate `Thread` for each of the servers. – Sergei Lebedev Mar 11 '16 at 13:40
  • @Sergei How can I tail logs using threads? – Roshan r Mar 11 '16 at 13:46
  • 1
    to feed a bunch of different processes into one (your seperate log tail -f into one file) make a queue, start the threads, write all output onto the queue and then consume the queue in another thread. Python Queue.Queue works great at this. There are lots of examples of this pattern on Stackoverflow – Vorsprung Mar 11 '16 at 13:59
  • @Vorsprung I haven't used queues until now, and I can't find a good example of tailing logs on a Linux machine from a Windows machine using queues. Can you point me to a document or an example? – Roshan r Mar 17 '16 at 10:38
  • There is some good material on threads/queues in Python here: http://stackoverflow.com/questions/2846653/how-to-use-threading-in-python/2846697#2846697 – Vorsprung Mar 17 '16 at 12:37
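
A minimal sketch of the thread-plus-queue pattern described in the comments above, assuming paramiko works on the Windows side (it is already imported in the question's script). The host, credentials, log path and output file name below are placeholders for illustration, not the real config:

from __future__ import with_statement
import threading
import Queue
import paramiko

def tail_remote(host, user, password, log_path, out_queue):
    # Open an SSH session, run 'tail -F' and push every line onto the shared queue.
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    # get_pty=True so the remote tail is terminated when the connection drops
    stdin, stdout, stderr = client.exec_command('tail -F ' + log_path, get_pty=True)
    for line in stdout:  # blocks and yields lines as they arrive
        out_queue.put('[%s] %s' % (host, line))

def write_queue_to_file(out_queue, out_file):
    # Single consumer: drain the queue and append everything to one file.
    with open(out_file, 'a') as fo:
        while True:
            fo.write(out_queue.get())
            fo.flush()

if __name__ == '__main__':
    # Hypothetical host/credentials/log path for illustration only.
    servers = [('10.209.17.113', 'root', 'password', '/var/log/abc')]
    q = Queue.Queue()
    for ip, user, pwd, path in servers:
        t = threading.Thread(target=tail_remote, args=(ip, user, pwd, path, q))
        t.daemon = True  # let the script exit even though tail never returns
        t.start()
    write_queue_to_file(q, 'E2E_tmp.txt')  # runs until interrupted (Ctrl+C)

The key point from the comments is that each `tail -F` blocks forever, so every server gets its own thread, while a single writer thread serialises everything into one file.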

0 Answers