
I am running a Linux command-line script 24 hours a day, and it writes data to a .csv file at roughly 5 lines per minute. Because the script runs continuously, the .csv file is open the whole time. The script can only output to a .csv file, not to a database.

What I now wish to do is write a Python script that analyses the content of the .csv file in real time, so that as soon as a new line of data is written to the file, it is checked to see if it matches certain conditions.

Any suggestions on how an already-open .csv file can be monitored in real time with Python?
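One common approach (not from the question itself, just a sketch) is a `tail -f`-style polling loop: open the file read-only, seek to the end, and repeatedly try to read new lines, sleeping briefly when none have arrived. The file name and the matching condition below are placeholders.

```python
import time

def follow(path, poll_interval=1.0):
    """Yield lines appended to the file at `path`, like `tail -f`.

    On Linux, opening the file read-only is safe even while another
    process holds it open for writing.
    """
    with open(path, "r") as f:
        f.seek(0, 2)  # jump to the end: only new lines are of interest
        while True:
            line = f.readline()
            if not line:            # nothing new yet: wait and retry
                time.sleep(poll_interval)
                continue
            yield line

# Usage sketch (runs forever; "output.csv" and the condition are
# placeholders for the real file and matching rules):
#   import csv
#   for line in follow("output.csv"):
#       row = next(csv.reader([line]))
#       if row and row[0] == "ERROR":
#           print("match:", row)
```

One caveat: `readline` can return a partial line if the writer is mid-write, so it may be worth only processing lines that end with a newline.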

thefragileomen
    You can analyze the line at the same time it's written. Or close the file every 5 minutes. Or you could use a database because there's no reason why you can't – OneCricketeer May 13 '17 at 21:16
  • Possible duplicate of [How do I watch a file for changes?](http://stackoverflow.com/questions/182197/how-do-i-watch-a-file-for-changes) – user2390182 May 13 '17 at 21:16
  • I'm with @cricket_007 on this one. If you've never looked at sqlite, it could solve the requirements here: https://docs.python.org/2/library/sqlite3.html – David Metcalfe May 13 '17 at 21:21
  • @DavidMetcalfe: sqlite3 is a very nice tool, but it is not really good for concurrent accesses... – Serge Ballesta May 13 '17 at 21:36
  • There is no portable way to trigger an action when a file is modified, so you have to either read it regularly to see if something has been appended or use OS-specific methods. As you do not say exactly what you want nor show what you have tried, your question is currently *unclear*... – Serge Ballesta May 13 '17 at 21:39

0 Answers