I have several Unix servers, each running the same application. I need to grep for a pattern in the application logs on every server and consolidate the matches from all servers into a single file.
This is how I'm currently doing it:
import subprocess

def run_command(command):
    # Run a shell command; return its stdout, or its stderr if the
    # command wrote anything to stderr.
    ps = subprocess.Popen(command, stdout=subprocess.PIPE,
                          stderr=subprocess.PIPE, shell=True)
    out, err = ps.communicate()
    if err != "":
        return err
    else:
        return out
Server_List = [['ServerA', 'BecomeAccountA'],
               ['ServerB', 'BecomeAccountB'],
               ['ServerC', 'BecomeAccountC'],
               ['ServerD', 'BecomeAccountD']]
Final_Result = ""
path = "some/path/"
pattern = "FindMe"

for server, becomeaccount in Server_List:
    command = "ssh -oConnectTimeout=5 -oBatchMode=yes -l %s %s 'grep %s %s'" % (
        becomeaccount, server, pattern, path)
    result = run_command(command)
    Final_Result += result

with open("/some/path/output", 'w') as f:
    f.write(Final_Result)
My output file then contains the following:
14012015.1449.30 [INFO] something FindMe something
14012015.1449.40 [INFO] something FindMe something
14012015.1450.13 [INFO] something FindMe something
14012015.1450.48 [INFO] something FindMe something
14012015.1451.04 [INFO] something FindMe something
14012015.1451.19 [INFO] something FindMe something
14012015.1451.77 [INFO] something FindMe something
14012015.1452.09 [INFO] something FindMe something
To produce this output file I have to make the SSH connections to all the servers one after another, which takes a while. I want to reduce the total runtime. Could I do this with multithreading, i.e. open multiple SSH connections at the same time? I have never tried multithreading before.
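From what I've read, something like the sketch below might work: it uses a thread pool from multiprocessing.dummy and reuses run_command, pattern, path and Server_List from above. It is untested, and the pool size of 4 is an arbitrary guess, not a tuned value. Is this the right approach?

from multiprocessing.dummy import Pool  # a thread pool, despite the module name

def fetch(server_info):
    # Build and run the same ssh/grep command as in the serial version.
    server, becomeaccount = server_info
    command = "ssh -oConnectTimeout=5 -oBatchMode=yes -l %s %s 'grep %s %s'" % (
        becomeaccount, server, pattern, path)
    return run_command(command)

# Threads (not processes) should be enough here because the work is
# I/O-bound: each call mostly waits on the remote ssh session.
pool = Pool(4)  # up to 4 concurrent SSH connections
results = pool.map(fetch, Server_List)
pool.close()
pool.join()

with open("/some/path/output", 'w') as f:
    f.write("".join(results))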
Note: the order of the lines in the output file is not important, so the order in which the SSH connections finish doesn't matter either; I can always sort the lines afterwards, since each one begins with a timestamp.
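For completeness, this is roughly how I intend to sort the consolidated file afterwards (a sketch assuming every line starts with a DDMMYYYY.HHMM.SS timestamp like the sample above):

from datetime import datetime

def line_time(line):
    # The first whitespace-separated field is the DDMMYYYY.HHMM.SS timestamp.
    return datetime.strptime(line.split()[0], "%d%m%Y.%H%M.%S")

with open("/some/path/output") as f:
    sorted_lines = sorted(f, key=line_time)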