For my script I need to use Python 2.6 with only the standard library. I am trying to write a script that walks a logs directory and matches only the log files whose timestamps are appropriate for a given event. The timestamp I am using is derived from the filename. I do not want to use the OS timestamps, because the files sometimes get copied off to a different directory to prevent them from being overwritten, and that changes the file's modified time.
A new file gets created every 200 MB. The timestamp in the filename is the time the file was created, and it represents the oldest log entry in the file.
import datetime
# One event might span multiple log files.
call_start = datetime.datetime(2018, 5, 15, 5, 25, 9)
call_stop = datetime.datetime(2018, 5, 15, 5, 37, 38)
# Timestamp values of file generated from file's naming convention
t1 = datetime.datetime(2018, 5, 15, 4, 48, 16)
t2 = datetime.datetime(2018, 5, 15, 5, 3, 53)
t3 = datetime.datetime(2018, 5, 15, 5, 19, 14)
t4 = datetime.datetime(2018, 5, 15, 5, 35)
t5 = datetime.datetime(2018, 5, 15, 5, 49, 19)
file_times = [t1, t2, t3, t4, t5]
matching_times = []
for ftime in file_times:
    # Logic I can't figure out
    if scratches_head:
        matching_times.append(ftime)
# I would expect the matching_times list to contain t3 and t4
Edit
Clarification from the comments:
t3 is a file that was created at 5:19:14am. The call_start is the first entry I would see in the log; it begins at 5:25:09am. Since t4 didn't get created until 5:35:00am, the call_start has to be in t3. The call_stop is the last log entry I want to find. It would be in t4, because t5 was created at 5:49:19am.
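The clarification above amounts to an interval-overlap test: each file covers the span from its own timestamp up to the next file's timestamp (the newest file is open-ended), and a file matches when that span overlaps [call_start, call_stop]. A sketch under that assumption, standard library only and compatible with Python 2.6 (the helper name files_covering is my own):

```python
import datetime

call_start = datetime.datetime(2018, 5, 15, 5, 25, 9)
call_stop = datetime.datetime(2018, 5, 15, 5, 37, 38)

file_times = [
    datetime.datetime(2018, 5, 15, 4, 48, 16),   # t1
    datetime.datetime(2018, 5, 15, 5, 3, 53),    # t2
    datetime.datetime(2018, 5, 15, 5, 19, 14),   # t3
    datetime.datetime(2018, 5, 15, 5, 35),       # t4
    datetime.datetime(2018, 5, 15, 5, 49, 19),   # t5
]

def files_covering(file_times, start, stop):
    """Return timestamps of files whose span overlaps [start, stop].

    Assumes file_times is sorted ascending and each file covers from
    its own timestamp up to the next file's timestamp.
    """
    matches = []
    for i, ftime in enumerate(file_times):
        # End of this file's span: the next file's start, or "forever"
        # for the newest (still-growing) file.
        if i + 1 < len(file_times):
            next_time = file_times[i + 1]
        else:
            next_time = datetime.datetime.max
        # Overlap: the file starts no later than the call ends, and
        # the next file starts after the call begins.
        if ftime <= stop and next_time > start:
            matches.append(ftime)
    return matches

matching_times = files_covering(file_times, call_start, call_stop)
# matching_times == [t3, t4]
```

With the sample values, t3 matches because t4 (5:35:00) starts after call_start, and t4 matches because it starts before call_stop while t5 does not. Whether the comparisons should be `<=` or `<` at the exact boundaries depends on whether a log entry stamped exactly at a file's creation time lands in that file, so adjust as needed.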