
I'm fairly familiar with Python's multiprocessing module, but I'm unsure how to implement this setup. My project has this basic flow:

request serial device -> gather response -> parse response -> assert response -> repeat

Right now this is a sequential operation that loops until it has gathered the desired number of asserted responses. I was hoping to speed this up by having a 'master process' do the first two operations, then pass the parsing and assertion work off to a queue of worker processes. However, this is only beneficial if the master process is ALWAYS running. I'm guaranteed to be working on a multi-core machine.
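For concreteness, here is a rough sketch of the structure I have in mind (request_serial_device and parse_and_assert are placeholders for my real serial I/O and parsing):

```python
import multiprocessing as mp
import queue

def request_serial_device():
    # Placeholder for my real serial request/response step.
    return b"OK"

def parse_and_assert(raw):
    # Placeholder for my real parse + assert step.
    return raw == b"OK"

def worker(raw_q, verdict_q):
    # Workers parse and assert responses until they see the None sentinel.
    for raw in iter(raw_q.get, None):
        verdict_q.put(parse_and_assert(raw))

if __name__ == "__main__":
    raw_q, verdict_q = mp.Queue(), mp.Queue()
    workers = [mp.Process(target=worker, args=(raw_q, verdict_q))
               for _ in range(3)]
    for w in workers:
        w.start()

    wanted, passed = 20, 0
    while passed < wanted:
        # This is the step I want to keep running at all times.
        raw_q.put(request_serial_device())
        try:
            while True:          # drain any verdicts that have come back
                if verdict_q.get_nowait():
                    passed += 1
        except queue.Empty:
            pass

    for w in workers:
        raw_q.put(None)          # tell each worker to exit
    for w in workers:
        w.join()
```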

Is there any way, using the multiprocessing module, to keep a process always in focus / always running so I can achieve this?

Philip Massey
  • I'm not sure what you are asking. You fire two processes which run forever and communicate with each other. Can you explain where the issue is? – freakish Aug 01 '13 at 16:05
  • If I understand how processes work correctly, when you create multiple processes their focus (from a processor perspective) is arbitrary. I.e., if I received 4 responses and passed the parsing onto workers, there could be a time in which my 4-core processor is only working on the parsing jobs, not the serial communication process. I was hoping there was a way to ensure that the serial communication process is always within processor focus. – Philip Massey Aug 01 '13 at 16:08
  • 1
    You cannot force a processor to focus on a given process (at least not under reasonable OS). But you misunderstand something: it's not that processor will focus on these 4 processes until they finish the job. Context switching takes place in the middle of a job (that's why you can open for example 10 movie players and all of them work at the same time). So you should not worry about that, your server won't get blocked. – freakish Aug 01 '13 at 16:22
  • I understand that; I must have misphrased my question. But I suppose it's unreasonable to block context switching for one task. Thanks! – Philip Massey Aug 01 '13 at 16:26
  • No worries. Maybe it's me who doesn't understand the problem. If you want you can rephrase it and I will think about it one more time. Cheers! P.S. If you want the other process to wait for the first one you can simply use locks. – freakish Aug 01 '13 at 16:28
  • Is this a hard real-time application? Do you have stringent limits on how long it takes to handle the serial communications before you start losing data? – Jonathan Aug 01 '13 at 17:26

2 Answers


From what I can gather (assuming you don't have a stringent requirement that the master is always logging data from the serial device), you just want the master to be ready to hand any worker a chunk of data, and to receive data back from any worker as soon as the worker is ready.

To achieve this, use two queues and multiprocessing:

Multiprocessing Queue in Python

How to use multiprocessing queue in Python?

This should be sufficient for your needs if time(parse data) >> time(gather data).
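A minimal sketch of that two-queue layout (gather_data and parse are hypothetical stand-ins for the actual serial I/O and parsing):

```python
import multiprocessing as mp

def gather_data():
    # Stand-in for the serial request/response step.
    return b"response"

def parse(chunk):
    # Stand-in for the parse + assert step.
    return chunk == b"response"

def worker(in_q, out_q):
    # Each worker takes a chunk, parses it, and reports the verdict.
    for chunk in iter(in_q.get, None):   # None is the shutdown sentinel
        out_q.put(parse(chunk))

if __name__ == "__main__":
    in_q, out_q = mp.Queue(), mp.Queue()
    pool = [mp.Process(target=worker, args=(in_q, out_q)) for _ in range(4)]
    for p in pool:
        p.start()

    # The master only gathers data and hands it out; parsing happens
    # in parallel in the workers.
    for _ in range(100):
        in_q.put(gather_data())
    verdicts = [out_q.get() for _ in range(100)]

    for p in pool:
        in_q.put(None)
    for p in pool:
        p.join()
    print(sum(verdicts), "assertions passed")
```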

staticd

Here's one way to implement your workflow:

  1. Have two multiprocessing.Queue objects: tasks_queue and results_queue. The tasks_queue will hold device outputs, and results_queue will hold results of the assertions.

  2. Have a pool of workers, where each worker pulls device output from tasks_queue, parses it, asserts, and puts the result of assertion on the results_queue.

  3. Have another process continuously polling the device and putting device output on the tasks_queue.

  4. Have one last process continuously polling results_queue, and ending the overall program when the desired number of results (successful assertions) is reached.

Total number of processes (multiprocessing.Process objects) is 2 + k, where k is the number of workers in the pool.
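A sketch of this layout, with poll_device and parse as hypothetical stand-ins for the real serial I/O and parsing (the counts and pool size are arbitrary):

```python
import multiprocessing as mp
import time

def poll_device():
    # Hypothetical stand-in for one serial request/response cycle.
    time.sleep(0.01)              # mimic device latency
    return b"data"

def parse(raw):
    # Hypothetical stand-in for parsing; True means the assertion passed.
    return raw == b"data"

def producer(tasks_queue, stop):
    # Step 3: continuously poll the device until told to stop.
    while not stop.is_set():
        tasks_queue.put(poll_device())

def worker(tasks_queue, results_queue):
    # Step 2: parse and assert each device output.
    for raw in iter(tasks_queue.get, None):   # None is the shutdown sentinel
        results_queue.put(parse(raw))

def collector(results_queue, stop, wanted):
    # Step 4: count successful assertions, then signal shutdown.
    passed = 0
    while passed < wanted:
        if results_queue.get():
            passed += 1
    stop.set()

if __name__ == "__main__":
    k = 4                                     # workers in the pool
    tasks_queue, results_queue = mp.Queue(), mp.Queue()
    stop = mp.Event()

    others = [mp.Process(target=producer, args=(tasks_queue, stop)),
              mp.Process(target=collector, args=(results_queue, stop, 50))]
    workers = [mp.Process(target=worker, args=(tasks_queue, results_queue))
               for _ in range(k)]
    for p in others + workers:
        p.start()

    stop.wait()                               # collector sets this when done
    for _ in workers:
        tasks_queue.put(None)                 # stop the workers
    for p in others + workers:
        p.join()
```

Here the collector signals completion through an Event, which is one simple way to tear everything down; the None sentinels unblock the workers so they can exit cleanly.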

Velimir Mlaker