
I'm working on a project that uses a BeagleBone to read from multiple sensors and write that data into text files. I'm monitoring six different muscles, so I have one file corresponding to each.

Right now, there are six sensors feeding into the BeagleBone Black's ADC pins, and my code instructs the BeagleBone to create one separate file for each pin. Whenever I run the code below, I only get one file (only the first function runs). Previously I had not included the "while True:" loops, which resulted in 1-2 readings and six files created. I added the "while True:" loops to record the sensors' data continuously, because I thought their absence was responsible for my not having more than two points in each file.

My question is: is it at all possible to write to multiple files at the same time? I could alternatively write all this data to the same file, but I'm interested in knowing what it is that makes this code not function as intended (producing six files).

        #File save for left hamstring
def LeftHamAcquisition():
        HamLData = open('HamLeft' + '.txt', 'a+')
        file_name = os.path.join(/relevant file path/)
        while True:
                EMGhamL = ADC.read_raw('AIN1')
                HamLData.write(str(elapsed_milliseconds))
                HamLData.write('\t')
                HamLData.write(str(EMGhamL))
                HamLData.write('\n')

        #File save for right hams
def RighHamAcquisition():
        HamRData = open('HamRight' + '.txt', 'a+')
        file_name2 = os.path.join(/relevant file path/)
        while True:
                EMGhamR = ADC.read_raw('AIN2')
                HamRData.write(str(elapsed_milliseconds))
                HamRData.write('\t')
                HamRData.write(str(EMGhamR))
                HamRData.write('\n')


        #file save for left quad
def LeftQuadAcquisition():        
        QuadLData = open('QuadLeft' + '.txt', 'a+')
        file_name3 = os.path.join(/relevant file path/)
        while True:
                EMGquadL = ADC.read_raw('AIN3')
                QuadLData.write(str(elapsed_milliseconds))
                QuadLData.write('\t')
                QuadLData.write(str(EMGquadL))
                QuadLData.write('\n')

        #file save for right quad
def RightQuadAcquisition():
        QuadRData = open('QuadRight' + '.txt', 'a+')
        file_name4 = os.path.join(/relevant file path/)
        while True:
                EMGquadR = ADC.read_raw('AIN4')
                QuadRData.write(str(elapsed_milliseconds))
                QuadRData.write('\t')
                QuadRData.write(str(EMGquadR))
                QuadRData.write('\n')

        #file save for left vast
def LeftVastAcquisition():
        VastLData = open('VastLeft' + '.txt', 'a+')
        file_name5 = os.path.join(/relevant file path/)
        while True:
                EMGVastL = ADC.read_raw('AIN5')
                VastLData.write(str(elapsed_milliseconds))
                VastLData.write('\t')
                VastLData.write(str(EMGVastL))
                VastLData.write('\n')

        #file save for right vast
def RightVastAcquisition():
        VastRData = open('VastRight' + '.txt', 'a+')
        file_name6 = os.path.join(/relevant file path/)
        while True:
                EMGVastR = ADC.read_raw('AIN6')
                VastRData.write(str(elapsed_milliseconds))
                VastRData.write('\t')
                VastRData.write(str(EMGVastR))
                VastRData.write('\n')

#The Program
print "Press ctrl-C to end acquisition"

LeftHamAcquisition()
RighHamAcquisition()
LeftVastAcquisition()
RightVastAcquisition()
LeftQuadAcquisition()
RightQuadAcquisition()

try:
    pass
except KeyboardInterrupt:      
    raise data.close()
  • What made you think they would all execute? `LeftHamAcquisition` has an infinite loop, so the function never returns. You need to use something like the multiprocessing library to get them to run in parallel. – skrrgwasme Apr 14 '16 at 21:28
  • As the other commenter pointed out, since each function contains an infinite loop, none of them will ever return, so your program gets stuck in whichever one is called first. Continuously writing to disk usually isn't the best option anyway, since it's generally an expensive operation. Why not have each of the functions send its data to a buffer, then have a separate thread write the buffer out? You'll almost certainly need a separate thread to do the writing, unless you check each sensor sequentially. – Carcigenicate Apr 14 '16 at 21:39
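The buffer-plus-writer-thread idea from the last comment can be sketched roughly like this (Python 3 module naming; the sample values and the file name are illustrative stand-ins, since there is no real `ADC.read_raw` call here):

```python
import queue
import threading

def writer(buf, f_name):
    # drain the shared buffer and write each (timestamp, value) sample to disk
    with open(f_name, 'a') as f:
        while True:
            sample = buf.get()
            if sample is None:        # poison pill: stop writing
                return
            timestamp, value = sample
            f.write("%s\t%s\n" % (timestamp, value))

buf = queue.Queue()
t = threading.Thread(target=writer, args=(buf, 'HamLeft.txt'))
t.start()

# a sensor loop would do something like:
#     buf.put((elapsed_milliseconds, ADC.read_raw('AIN1')))
buf.put((0, 512))     # illustrative sample
buf.put(None)         # shut the writer down
t.join()
```

The sensor loops then only touch the in-memory queue, and a single thread handles all the file I/O.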

1 Answer


Your function calls have infinite loops in them, so they will never return. One file was created by LeftHamAcquisition, but since it never returned, none of the other functions were ever executed. You need to use something like the multiprocessing module to get them to run in parallel. In particular, I would recommend multiprocessing pools and the apply_async function:

import multiprocessing
import Queue
import time

# one global constant: the poison pill
# this could really be whatever you want - my string choice is arbitrary
STOP = "stop"

# change the signature of your function to accept a queue for the main 
# process to pass a poison pill
def LeftHamAcquisition(kill_queue):
    f_name = 'HamLeft.txt'

    # you aren't doing anything with "file_name" - should it be removed?
    # file_name = os.path.join(/relevant file path/)

    # use file context managers:
    with open(f_name, 'a+') as HamLData:
        while True:

            # in the infinite loop, we add a check for the poison pill
            try:
                val = kill_queue.get(block=False)
                if val == STOP:
                    return # leave if the poison pill was sent
            except Queue.Empty:
                pass # ignore empty queue

            EMGhamL = ADC.read_raw('AIN1')
            HamLData.write(str(elapsed_milliseconds))
            HamLData.write('\t')
            HamLData.write(str(EMGhamL))
            HamLData.write('\n')

# ... the rest of your functions ...

#The Program
print "Press ctrl-C to end acquisition"

# a list of your functions
f_list = [
    LeftHamAcquisition,
    RighHamAcquisition,
    LeftVastAcquisition,
    RightVastAcquisition,
    LeftQuadAcquisition,
    RightQuadAcquisition,
]

pool = multiprocessing.Pool()  # create the worker pool
kill_queue = multiprocessing.Queue() # create the queue to pass poison pills

for f in f_list:
    # kick off the functions, passing them the poison pill queue
    pool.apply_async(f, args=(kill_queue,))
try:
    # put the main process to sleep while the workers do their thing
    while True:
        time.sleep(60)

except KeyboardInterrupt:      

    # close the workers nicely - put one poison pill on the queue for each
    for f in f_list:
        kill_queue.put(STOP)
    pool.close()
    pool.join()
    raise  # re-raise the KeyboardInterrupt after cleaning up

Also, there's no reason to have this many functions. They all do the same thing, just with different strings and variable names. You should refactor them into one function that you can pass arguments to:

def acquisition(kill_queue, f_name, ain):

    with open(f_name, 'a+') as f:
        while True:

            try:
                val = kill_queue.get(block=False)
                if val == STOP:
                    return
            except Queue.Empty:
                pass

            an_val = ADC.read_raw(ain)

            # where does "elapsed_milliseconds" come from? it's undefined
            # in your example code
            f.write("{}\t{}\n".format(elapsed_milliseconds, an_val))

With this function, instead of providing a list of individual functions as in my multiprocessing example, you can just reuse this one function, calling it repeatedly with different arguments (which is the whole point of functions).
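For example, the launch loop might look like this (a sketch assuming the `acquisition` function above; the file names and `AIN` channel strings mirror the question's setup but are illustrative):

```python
import multiprocessing

# (output file, ADC input) pairs for the six muscles
channels = [
    ('HamLeft.txt',   'AIN1'),
    ('HamRight.txt',  'AIN2'),
    ('QuadLeft.txt',  'AIN3'),
    ('QuadRight.txt', 'AIN4'),
    ('VastLeft.txt',  'AIN5'),
    ('VastRight.txt', 'AIN6'),
]

def launch(acquisition):
    """Start one worker per channel; return the pool and poison pill queue."""
    pool = multiprocessing.Pool(len(channels))
    kill_queue = multiprocessing.Queue()
    for f_name, ain in channels:
        # note args must be a tuple, hence the trailing comma style
        pool.apply_async(acquisition, args=(kill_queue, f_name, ain))
    return pool, kill_queue
```

Adding a seventh sensor is then a one-line change to `channels` instead of a seventh copy-pasted function.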

  • Wow! This really cleared things up. As you probably guessed, I'm new to Python. I did know about the infinite loops, but I didn't know that the Beagle couldn't run multiple loops simultaneously out of the gate. I do have one question though: how would one input the arguments of the general function you created in your second example? Is `kill_queue` simply set to be equal to `val`? – boktor Apr 15 '16 at 03:49
  • @boktor Both of your questions are answered in the [multiprocessing docs](https://docs.python.org/2/library/multiprocessing.html). Read over the `apply_async` and `queue.get` sections, and ping me here again if you still don't understand. – skrrgwasme Apr 15 '16 at 15:37
  • 1
    From what I've been able to gather it seems that `kill_queue` is your stand in for allowing the user to be able to input the 'poison pill' or `STOP` string? – boktor Apr 15 '16 at 20:04
  • @boktor Basically, yes. It propagates the "stop working" message to the workers. `kill_queue` is a connection between your main process and the workers that allows messages to be passed back and forth. When the user hits CTRL+C, instead of just `pass`ing the exception, we pass a poison pill to every worker so they close the files and exit cleanly. Then we close the worker pool and exit. Without doing this, the workers would be killed when the main process exits, but they wouldn't exit cleanly, and there may be data left in their output buffers that won't get written to files (it is lost). – skrrgwasme Apr 15 '16 at 20:11
  • @boktor Just to be very clear - CTRL+C *will* close the workers, because they will be killed when the main process exits as a result of the CTRL+C. The message passing is only necessary to make sure it happens *cleanly*, preventing data loss. – skrrgwasme Apr 15 '16 at 20:14
  • @boktor By the way, I just noticed a bug I had - don't forget to call `pool.join()` after `pool.close()`. – skrrgwasme Apr 15 '16 at 20:18
  • I hate to bother you again, but it seems I misunderstood what I was trying to confirm with my question a while ago. I couldn't tell what `kill_queue` was defined as in your example. I couldn't find anything in the multiprocessing docs relating to it or resembling your usage so I thought it was an arbitrarily defined function (like raw_input(), which doesn't work-- `str`s have no get attribute) – boktor Apr 23 '16 at 22:44
  • `kill_queue` is an arbitrary variable name I chose. I could have called it `slurpee`. I've edited the code to hopefully make this a bit more clear. I create the queue for passing poison pills in the main process with `kill_queue = multiprocessing.Queue()`, then I pass it to the functions executing in the subprocesses when I start them with this: `pool.apply_async(f, args=(kill_queue,))`. Since I modified your functions to accept an argument (`def LeftHamAcquisition(kill_queue):`), they now get the queue as their only argument. `kill_queue` was my own chosen descriptive name for the queue. – skrrgwasme Apr 24 '16 at 21:08
  • Well, now the code works, but I'm at square one again. I'm just getting 1 file Instead of the 6 I was aiming for. Either method (1 universal function w/ 3 args vs. individual functions for each group) returns just 1 file, which is the one at the beginning of the queue. – boktor Apr 24 '16 at 22:28
  • I would open a new question with your updated code then. I suspect your subprocesses aren't being launched correctly, but I can't debug it without seeing the code, and that's too much to do in the comments here. But this question is complete and adding the new code would change it, so consider opening a new question, and include the new multiprocessing code you're using. Preferably the code with only function, because that's the way you *should* be handling it. – skrrgwasme Apr 24 '16 at 22:33
  • I have opened a new question with the general function definition – boktor Apr 24 '16 at 22:51