I am running a Linux command-line script 24 hours a day, and it outputs data to a .csv file at roughly 5 lines per minute. Because the script runs continuously, the .csv file is held open the whole time. The script can only write to a .csv file, not to a database.
I now want to write a Python script that analyses the contents of the .csv file in real time, so that as soon as a new line of data is written to the file it is checked to see whether it matches certain conditions.
Any suggestions on how a .csv file that is already open in another process can be monitored in real time with Python?
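
For what it's worth, below is the rough approach I have sketched so far: a simple tail-style polling loop that skips the existing contents and only reads newly appended lines. The file path and the `matches_conditions` check are just placeholders, and I'm not sure whether this is a sensible way to read a file that the other script still has open for writing.

```python
import csv
import time

CSV_PATH = "/path/to/output.csv"  # placeholder: the file the command-line script writes to


def follow(path, poll_interval=1.0):
    """Yield new lines appended to *path*, polling roughly once per second."""
    with open(path, "r", newline="") as f:
        # Skip everything already in the file; only new rows are of interest.
        f.seek(0, 2)
        while True:
            line = f.readline()
            if not line:
                # No new data yet; wait before trying again.
                time.sleep(poll_interval)
                continue
            yield line


def matches_conditions(row):
    # Placeholder check -- replace with the real conditions.
    return len(row) > 2 and row[2] == "ERROR"


for line in follow(CSV_PATH):
    # Parse the single new line with the csv module so quoting is handled.
    row = next(csv.reader([line]))
    if matches_conditions(row):
        print("Match:", row)
```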