I often need to look up log files. The problem is that each log file is buried dozens of folders deep, and looking up 50 of them by hand would literally take an hour or more.
I have been using a batch file to scan the drive overnight and compile a list of every file path, in the following format:
Z:\folder\folder2\folder3\folder4\folder5\folder6\folder7\ <about another 20 folders > \log.txt
The current command is:
dir /b /-d /-p /s /A:-D > directories.txt
The resulting txt file has around 500 thousand lines like this.
Then, when I need to look up a set of logs, I run another batch file that scans that txt file and pulls out the 50 paths I need.
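To make the lookup step concrete, here is roughly what that second batch does, sketched in Python for clarity (the function name, index filename, and lookup keys are made up for the example; the real thing is a batch file):

```python
def pull_matching_paths(index_file, needles):
    """Return every line of the directory index that contains any of the
    requested identifiers (hypothetical lookup keys, e.g. job numbers)."""
    matches = []
    with open(index_file) as fh:
        for line in fh:
            path = line.strip()
            if any(n in path for n in needles):
                matches.append(path)
    return matches
```

In batch this is essentially a `findstr` against directories.txt, so the lookup itself is fast; the bottleneck is building the index, not searching it.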
The problem with the current solution is that, as the log database grows, the scan now takes 12+ hours, which means it can no longer finish overnight. And I need to run it every night to keep the list current.
Question:
So, my question to you guys: what is the best way to do this? I can't change any of the directory structures (this is a database of logs used by hundreds of people), and I don't really know any languages other than batch scripting. But batch seems limited and doesn't let me do either of the following, which would solve my problem:
- Skip directories that have not been modified in the last 48 hours
- Skip subdirectories of folders with specific keywords in the name
If I could do the above two things with batch, the txt file output would probably shrink from 500 thousand lines to maybe 3 thousand.
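For illustration only (I know the real solution would have to be batch or something I can run on Windows), here is the pruning logic I'm imagining, sketched in Python; the keyword list and the 48-hour cutoff are made-up example values:

```python
import os
import time

SKIP_KEYWORDS = ("archive", "backup")  # hypothetical keywords to prune on
MAX_AGE_SECONDS = 48 * 3600            # the 48-hour cutoff

def find_recent_logs(root, now=None):
    """Walk `root`, pruning any subtree whose directory name contains a
    skip keyword or whose directory mtime is older than the cutoff, and
    collect the paths of log.txt files found in the remaining folders."""
    now = time.time() if now is None else now
    hits = []
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune in place: os.walk will not descend into removed entries.
        dirnames[:] = [
            d for d in dirnames
            if not any(k in d.lower() for k in SKIP_KEYWORDS)
            and now - os.path.getmtime(os.path.join(dirpath, d)) <= MAX_AGE_SECONDS
        ]
        if "log.txt" in filenames:
            hits.append(os.path.join(dirpath, "log.txt"))
    return hits
```

One caveat I'm aware of: a folder's modified time only changes when its direct contents change, not when a file deep inside a subfolder does, so pruning a parent folder by mtime could skip logs that were updated further down. The idea would be to merge each night's small result into the master list rather than rebuilding it from scratch.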