
I have streaming data from many sensors that is written every second to a .tmp file on my computer. I am trying to find a way to read this data sequentially as it arrives and feed it to a function that performs computations on the stream.

Is there any way to read this kind of data from the .tmp file and perform the computations at the instant the data arrives?

Bhakti

1 Answer


Maybe something like this will help. I created two Python files, one reader and one writer.

For example, my writer appends a JSON string with the key age to a text file every second:

import random
import time

# append one JSON record per second
with open("test.txt", "a") as t:
    while True:
        time.sleep(1)
        t.write('{"age": ' + str(random.randint(1, 100)) + '}\n')
        t.flush()  # make the new line visible to the reader immediately

The reader watches the file for growth, reads the newly written lines, and calculates the median of the data so far:

import json
import statistics
import time

agearray = []

with open("test.txt", "rb") as t:
    current_filesize = t.seek(0, 2)  # start at the current end of file
    while True:
        new_filesize = t.seek(0, 2)
        if new_filesize > current_filesize:
            t.seek(current_filesize)
            data = t.read(new_filesize - current_filesize)
            # more than one line may have been appended since the last check,
            # so parse each line as its own JSON object
            for line in data.decode("utf-8").splitlines():
                myjson = json.loads(line)
                print(myjson)
                agearray.append(myjson["age"])
                print(statistics.median(agearray))
            current_filesize = new_filesize
        else:
            time.sleep(0.5)  # avoid busy-waiting when nothing has changed

Granted, this isn't the best example, but it would be my approach.
You have to run the two files as two separate processes, for example in two terminal windows (cmd or Git Bash).
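A possible alternative sketch (my own; the helper name follow and its parameters are not from the answer above): tail the file line by line with a generator, so each complete JSON line is parsed on its own as it arrives. The demo below writes a small sample file first so it is self-contained; with a live writer you would drop max_idle and let it poll forever.

```python
import json
import statistics
import time


def follow(path, poll_interval=1.0, max_idle=None):
    """Yield complete lines appended to the file at `path`.

    Polls every `poll_interval` seconds. If `max_idle` is set, the
    generator stops after that many seconds without new data (handy
    for demos and tests); otherwise it tails the file forever.
    """
    idle = 0.0
    with open(path, "r") as f:
        while True:
            pos = f.tell()
            line = f.readline()
            if line.endswith("\n"):
                idle = 0.0
                yield line
            else:
                f.seek(pos)  # incomplete line so far; rewind and retry
                if max_idle is not None and idle >= max_idle:
                    return
                time.sleep(poll_interval)
                idle += poll_interval


# self-contained demo: write a small sample file first
with open("test.txt", "w") as t:
    for age in (30, 10, 20):
        t.write(json.dumps({"age": age}) + "\n")

ages = []
for line in follow("test.txt", poll_interval=0.1, max_idle=0.3):
    ages.append(json.loads(line)["age"])
    print(statistics.median(ages))
```

Because follow only ever yields whole lines, the reader never tries to json.loads a partial write or several records at once.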

Fabian