
Example:

I have installed a sensor in a car that sends data continuously. I have to process (fuse) this continuous stream, but new data keeps arriving while the processing function is still running. How can I store the data that arrives during processing so it can be handled later?

    sample code:

    buffer1 = []
    buffer2 = []

    def process_function(buffer):
        # processing
        ...

    while True:
        # data is received continuously
        buffer1.append(data)
        if len(buffer1) > 0:
            process_function(buffer1)
        buffer2.append(data)

(While process_function is working on buffer1, the incoming data should be stored in buffer2; once buffer1 is finished, process_function should switch to buffer2, and so on.)
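The buffer-swapping described above can be sketched with a background thread and a lock; a single list that is swapped out atomically does the same job as two named buffers. This is only an illustrative sketch: `read_sensor()` is a hypothetical stand-in for the real data source, and the timings are arbitrary.

```python
import threading
import time

buffer = []                 # active buffer being filled by the collector
lock = threading.Lock()
stop = threading.Event()
processed = []              # batch sizes, recorded for illustration

def read_sensor():
    # hypothetical stand-in for the real sensor reading
    return time.time()

def collector():
    # background thread: keeps appending while processing runs elsewhere
    while not stop.is_set():
        reading = read_sensor()
        with lock:
            buffer.append(reading)
        time.sleep(0.01)

def process_function(batch):
    processed.append(len(batch))

t = threading.Thread(target=collector, daemon=True)
t.start()
for _ in range(3):
    time.sleep(0.1)
    with lock:
        batch, buffer[:] = buffer[:], []  # swap the filled buffer out
    process_function(batch)               # collector keeps filling meanwhile
stop.set()
t.join()
```

The swap under the lock is the whole trick: processing always works on a snapshot while the collector writes into a fresh list.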
stovfl
  • My suggestion: you can use the Kafka streaming tool. You don't need to manage your data with buffers; Kafka will handle all of that. – Beyhan Gul Nov 29 '19 at 07:20
  • We’re going to need far more information than this. – AMC Nov 29 '19 at 07:21
  • @AlexanderCécile Please tell what you need. – prabhuiitdhn Nov 29 '19 at 07:33
  • @prabhuiitdhn Information about the sensor, for one... – AMC Nov 29 '19 at 07:35
  • @AlexanderCécile Thanks Alexander, but I don't think it is mandatory to know the details of the sensor; just assume some random numbers are coming. – prabhuiitdhn Nov 29 '19 at 08:04
  • @AlexanderCécile How can I use multithreading so that process_function executes while, at the same time, buffer2 collects the incoming data? – prabhuiitdhn Nov 29 '19 at 08:05
  • 1
    As suggested by @beyhan why don't you use Kafka.. it's very straight forward.. Have a process which keep waiting for sensor information via Kafka.. and produce message to a Kafka queue from the sensor.. If in future u have multi-threaded processes consuming from the same queue Kafka itself will manage.. – Akash Sundaresh Nov 29 '19 at 08:21
  • @prabhuiitdhn Read [using-asyncio-queue-for-producer-consumer-flow](https://stackoverflow.com/questions/52582685/using-asyncio-queue-for-producer-consumer-flow) and [producer-consumer-problem-with-python-multiprocessing](https://stackoverflow.com/questions/914821/producer-consumer-problem-with-python-multiprocessing) – stovfl Nov 29 '19 at 09:23
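The producer/consumer pattern from the two linked questions can be sketched with `asyncio.Queue`. The fake readings and the `None` end-of-stream sentinel here are illustrative assumptions, not part of any real sensor API:

```python
import asyncio

async def producer(queue):
    # stand-in for the real sensor: push a few fake readings
    for i in range(5):
        await queue.put(i)
        await asyncio.sleep(0.01)
    await queue.put(None)  # sentinel: no more data

async def consumer(queue):
    results = []
    while True:
        item = await queue.get()
        if item is None:
            break  # producer signalled end of stream
        results.append(item)
    return results

async def main():
    queue = asyncio.Queue()
    _, results = await asyncio.gather(producer(queue), consumer(queue))
    return results

print(asyncio.run(main()))  # -> [0, 1, 2, 3, 4]
```

Because both coroutines share one event loop, the queue needs no locking; `await queue.get()` simply yields until the producer has put something.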

1 Answer


You could use a multiprocessing Queue and two processes: one for the producer and one for the consumer:

    from multiprocessing import Process, Queue
    import time

    def collection_sensor_values(mp_queue):
        fake_value = 0
        while True:
            mp_queue.put(f"SENSOR_DATA_{fake_value}")
            fake_value += 1
            time.sleep(2)

    def process_function(mp_queue):
        while True:
            sensor_reading = mp_queue.get(block=True)
            print(f"Received sensor reading: {sensor_reading}")

    if __name__ == "__main__":
        q = Queue()
        sensor_collector_process = Process(target=collection_sensor_values, args=(q,))
        readings_process = Process(target=process_function, args=(q,))
        all_procs = [sensor_collector_process, readings_process]

        for p in all_procs:
            p.start()

        for p in all_procs:
            # block until the processes finish (here: until interrupted)
            if p.is_alive():
                p.join()

        for p in all_procs:
            # clean up any process that is still running
            if p.is_alive():
                p.terminate()
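A note on shutdown: since both loops in the answer run forever, `join()` never returns and the `terminate()` loop is only reached if a process dies on its own. If the consumer should exit cleanly instead of being killed, a common variant (sketched here with a hypothetical `SENTINEL` marker and a bounded producer) is to have the producer push an end-of-stream value:

```python
from multiprocessing import Process, Queue

SENTINEL = None  # hypothetical end-of-stream marker

def bounded_collector(mp_queue, n):
    # push a fixed number of fake readings, then signal completion
    for i in range(n):
        mp_queue.put(f"SENSOR_DATA_{i}")
    mp_queue.put(SENTINEL)

def draining_consumer(mp_queue):
    while True:
        reading = mp_queue.get()
        if reading is SENTINEL:
            break  # producer is done; exit instead of blocking forever
        print(f"Received sensor reading: {reading}")

if __name__ == "__main__":
    q = Queue()
    producer = Process(target=bounded_collector, args=(q, 5))
    consumer = Process(target=draining_consumer, args=(q,))
    producer.start()
    consumer.start()
    producer.join()
    consumer.join()  # both exit on their own, no terminate() needed
```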
caffeinate_me