I am currently writing a program with two separate files, Main.py and App.py. Main.py takes in readings such as distance and temperature and writes them to a file named data.txt, and App.py then reads from that text file.
#main.py
def scan():
    global object_temp, close_enough, real_distance  # close_enough is True if someone is near
    # ... takes the readings and returns them as a list

while True:
    f = open("data.txt", "a+")
    a = str(scan()) + "\n"
    f.write(a)
    log.log(a)
    f.close()
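To make this easier to reproduce, here is a self-contained version of the writer loop with a stubbed `scan()` (the values are made up, and it only appends five readings instead of looping forever):

```python
import random

def scan():
    # stub standing in for the real sensor read (values are made up)
    object_temp = round(random.uniform(25, 27), 2)
    ambient_temp = round(random.uniform(30, 31), 2)
    real_distance = round(random.uniform(120, 130), 2)
    close_enough = real_distance < 50
    return [object_temp, ambient_temp, real_distance, close_enough]

# append a handful of readings instead of looping forever
with open("data.txt", "a") as f:
    for _ in range(5):
        f.write(str(scan()) + "\n")
```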
#data.txt
#imagine this, but with almost 60,000 lines each time I run it, since it writes a reading every second
[26.03, 30.91, 126.5, False]
[25.97, 30.89, 125.69, False]
[25.97, 30.89, 124.74, False]
...
#app.py
def getValues():
    global prevValues
    f = open("data.txt", "r")
    latestValue = f.read()
    f.close()
    # log.log(latestValue, "LATEST VALUES")
    latestValue = latestValue.split("\n")[-2].strip('][').split(', ')
    log.log(latestValue, "LATEST VALUES")
    if latestValue == "":
        return prevValues
    else:
        prevValues = latestValue  # must run before the return, or it never runs
        return latestValue
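Since App.py only ever needs the newest reading, one thing I tried is seeking to the end of the file and reading just the tail instead of the whole thing (self-contained sketch with a small sample data.txt standing in for the real one):

```python
import os

# sample file standing in for the real data.txt (hypothetical values)
with open("data.txt", "w") as f:
    f.write("[26.03, 30.91, 126.5, False]\n")
    f.write("[25.97, 30.89, 125.69, False]\n")

def get_last_line(path):
    # read only the tail of the file instead of the whole 60,000 lines
    with open(path, "rb") as f:
        f.seek(0, os.SEEK_END)
        size = f.tell()
        f.seek(max(0, size - 4096))  # the last 4 KB is plenty for one reading
        chunk = f.read().decode()
    lines = [ln for ln in chunk.split("\n") if ln.strip()]
    return lines[-1] if lines else ""

# parse the bracketed list the same way getValues() does
values = get_last_line("data.txt").strip("][").split(", ")
```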
The problem is that the text file gets flooded with readings, which over time slows the program down. I know this is not the most efficient way to go about it, but I am just getting into Python. Is there any way to transfer the data directly from Main.py to App.py, or a way to delete/overwrite old lines in the text file once it reaches a certain number of lines? For example, after 50 lines it starts deleting/overwriting the oldest ones.
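The kind of capping behaviour I have in mind would look something like this (hypothetical `trim_file` helper; it assumes the file is small enough to read into memory while trimming):

```python
def trim_file(path, max_lines=50):
    # keep only the newest max_lines lines of the file
    # (assumption: the file still fits in memory when we trim it)
    with open(path) as f:
        lines = f.readlines()
    if len(lines) > max_lines:
        with open(path, "w") as f:
            f.writelines(lines[-max_lines:])

# demo: a file with 60 numbered lines gets trimmed down to the newest 50
with open("readings.txt", "w") as f:
    for i in range(60):
        f.write(f"line {i}\n")
trim_file("readings.txt", max_lines=50)
```

Is something like this reasonable to call after every write, or is there a better built-in way?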