
I am using the seek function to read new lines from a file as it is updated. My code looks like this:

import time

read_data = open('path-to-myfile', 'r')
read_data.seek(0, 2)  # jump to the end of the file
while True:
    time.sleep(sometime)
    new_data = read_data.readlines()  # read any lines appended since the last read
    # do something with new_data

myfile is a CSV file that is constantly being updated.

The problem is that, usually after several iterations of the while loop, new_data returns nothing. The number of iterations it takes varies. When I check myfile, it is still being updated. Is there a problem with my code, or is there another way to do this?

Any help is appreciated!

1 Answer


You have two programs accessing the same file on disk? If that is the case, the resource may be locked. I set up an example script that writes to a file, and another that reads it for changes, based on the code you provided.

So in one instance of Python:

import time

while True:
    time.sleep(2)
    # append mode always writes at the end of the file, so no seek is needed;
    # closing the file at the end of the with block flushes the line to disk
    with open('test.txt', 'a') as write_data:
        write_data.write("bibbity boopity\n")

And in another instance of Python:

import time

read_data = open('test.txt', 'r')
read_data.seek(0, 2)  # start at the current end of the file
while True:
    time.sleep(1)
    new_data = read_data.readlines()  # pick up any lines appended since the last read
    print(new_data)

In this case, the file is being updated more slowly than it is being read, so some of the reads printed by the second program come back empty. If I speed up the writes, I still see the changes, but there are instances where not all of the updates are caught.
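
One thing that can make updates look like they were missed (this is an assumption, not something verified against the example above) is readlines() picking up a line the writer has only partially written; the fragment is consumed on one poll and the rest on the next, so no single read contains the whole line. Here is a minimal sketch of a polling reader that buffers incomplete lines until the trailing newline arrives, reusing the test.txt name and a one-second interval from the example:

import time

def follow(path, interval=1.0):
    # Yield only complete lines; buffer a partially written line
    # until its trailing newline shows up on a later poll.
    partial = ''
    with open(path, 'r') as f:
        f.seek(0, 2)              # start at the current end of the file
        while True:
            chunk = f.readline()  # '' if nothing new has been appended yet
            if not chunk:
                time.sleep(interval)
                continue
            partial += chunk
            if partial.endswith('\n'):   # only hand back finished lines
                yield partial.rstrip('\n')
                partial = ''

# usage: print every complete line as it is appended
for line in follow('test.txt'):
    print(line)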

You may want to use asynchronous file reading to catch all the changes. Python 3's asyncio library doesn't support async file read/write, but curio does.
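
If you would rather stay inside the standard library, one workaround (a sketch of an alternative approach, not curio's API) is to push the blocking readlines() call onto a worker thread with asyncio.to_thread, available in Python 3.9+, so the event loop stays free while the file is polled:

import asyncio

async def poll_file(path, interval=1.0):
    # Run the blocking readlines() in a worker thread so the event
    # loop can keep doing other work between polls.
    f = open(path, 'r')
    f.seek(0, 2)  # start at the current end of the file
    try:
        while True:
            await asyncio.sleep(interval)
            new_data = await asyncio.to_thread(f.readlines)
            if new_data:
                print(new_data)
    finally:
        f.close()

asyncio.run(poll_file('test.txt'))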

See also this question.
