
I have a script that runs tcpdump indefinitely and writes its output to a capture.out file. I would like to write another Python script to monitor capture.out and iterate over a loop each time a new line (or, even better, a new packet) is written to the file by the other script.

I know how to loop through lines in a file, but I am not sure how to continuously monitor a file and iterate only when a new line (or packet) is written by the other script.
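One way to do this in pure Python is a `tail -f` style polling loop: seek to the end of the file, then repeatedly try to read a new line, sleeping briefly when none is available. A minimal sketch (the filename `capture.out` and the 0.1 s poll interval are assumptions, not from the original):

```python
import time

def follow(path):
    """Yield lines appended to `path`, tail -f style.

    Seeks to the current end of the file, then polls for new
    lines, sleeping briefly between empty reads.
    """
    with open(path) as f:
        f.seek(0, 2)  # jump to the end of the file
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.1)  # no new data yet; wait and retry
                continue
            yield line

# Hypothetical usage:
# for line in follow("capture.out"):
#     handle_packet_line(line)
```

Note that this yields whatever `readline()` returns, so if the writer is mid-line when you poll, you may see a partial line; buffering on the writer's side (see tcpdump's `-l` flag) matters here.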

My ultimate goal is to publish each captured packet over MQTT (filtering out the MQTT traffic itself, of course), so if there is a more efficient route to that goal, such as bypassing the output file entirely and making a Python function call on each packet tcpdump captures, that would be even better.

vincebel
    Can you pipe it through `tail -f`? Or pipe/tee the first script's output into the Python script? – Kelly Bundy Feb 18 '20 at 17:29
  • Does this answer your question? [Reading from a frequently updated file](https://stackoverflow.com/questions/5419888/reading-from-a-frequently-updated-file) – wwii Feb 18 '20 at 17:57
  • @HeapOverflow thank you! I didn't realize it could be as simple as iterating for each line of stdin and piping tcpdump, oops. `tcpdump -s 1500 port not 22 and port not 5672 | python publish.py` – vincebel Feb 18 '20 at 18:38
  • @wwii This certainly looks like it could work as well, but I was just overthinking it and a simple pipe was able to accomplish what I needed. – vincebel Feb 18 '20 at 18:45
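The pipe approach from the comments (`tcpdump ... | python publish.py`) can be sketched as a small forwarder that reads packet summary lines from stdin and hands each one to a publish callable. The function below is a sketch; the MQTT wiring in the trailing comment assumes the third-party paho-mqtt package, a broker on localhost, and a topic name that are illustrative only:

```python
import sys

def forward_packets(stream, publish):
    """Forward each non-empty packet summary line from `stream`
    to the `publish` callable; returns the number published."""
    published = 0
    for line in stream:
        line = line.rstrip("\n")
        if line:
            publish(line)
            published += 1
    return published

# Hypothetical MQTT wiring (assumes `pip install paho-mqtt` and a
# broker at localhost; broker address and topic are placeholders):
#
#   import paho.mqtt.client as mqtt
#   client = mqtt.Client()
#   client.connect("localhost")
#   client.loop_start()
#   forward_packets(sys.stdin, lambda p: client.publish("capture/packets", p))
```

When piping, it helps to run tcpdump with `-l`, which makes its stdout line-buffered so the Python side sees each packet line promptly instead of waiting for a full buffer.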

0 Answers