
I am trying to pass sound data to a subprocess in Python through shared_memory. Currently, in one program I convert the sound's byte data to a NumPy array of int16 samples. I can access the shared memory of the NumPy array from both Python processes, but converting the NumPy array back to a bytearray takes too long for what I am trying to do. Is there a way to just pass the byte array itself to a Python subprocess (through shared_memory or something else)?
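Roughly, the round trip I have now looks like this (a sketch with a hypothetical get_sound_bytes() standing in for my audio source, assuming raw 16-bit PCM):

import numpy as np
from multiprocessing import shared_memory

sound_bytes = get_sound_bytes()                       # hypothetical: raw 16-bit PCM bytes
samples = np.frombuffer(sound_bytes, dtype=np.int16)  # bytes -> int16 array
shm = shared_memory.SharedMemory(create=True, size=samples.nbytes)
shared = np.ndarray(samples.shape, dtype=samples.dtype, buffer=shm.buf)
shared[:] = samples[:]                                # copy the samples into shared memory
# the other process rebuilds the array from shm and has to turn it
# back into bytes before writing it to the audio stream, which is the slow part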

The Python example I based my code on is:

>>> # In the first Python interactive shell
>>> import numpy as np
>>> a = np.array([1, 1, 2, 3, 5, 8])  # Start with an existing NumPy array
>>> from multiprocessing import shared_memory
>>> shm = shared_memory.SharedMemory(create=True, size=a.nbytes)
>>> # Now create a NumPy array backed by shared memory
>>> b = np.ndarray(a.shape, dtype=a.dtype, buffer=shm.buf)
>>> b[:] = a[:]  # Copy the original data into shared memory
>>> b
array([1, 1, 2, 3, 5, 8])
>>> type(b)
<class 'numpy.ndarray'>
>>> type(a)
<class 'numpy.ndarray'>
>>> shm.name  # We did not specify a name so one was chosen for us
'psm_21467_46075'

>>> # In either the same shell or a new Python shell on the same machine
>>> import numpy as np
>>> from multiprocessing import shared_memory
>>> # Attach to the existing shared memory block
>>> existing_shm = shared_memory.SharedMemory(name='psm_21467_46075')
>>> # Note that a.shape is (6,) and a.dtype is np.int64 in this example
>>> c = np.ndarray((6,), dtype=np.int64, buffer=existing_shm.buf)
>>> c
array([1, 1, 2, 3, 5, 8])
>>> c[-1] = 888
>>> c
array([  1,   1,   2,   3,   5, 888])

>>> # Back in the first Python interactive shell, b reflects this change
>>> b
array([  1,   1,   2,   3,   5, 888])

>>> # Clean up from within the second Python shell
>>> del c  # Unnecessary; merely emphasizing the array is no longer used
>>> existing_shm.close()

>>> # Clean up from within the first Python shell
>>> del b  # Unnecessary; merely emphasizing the array is no longer used
>>> shm.close()
>>> shm.unlink()  # Free and release the shared memory block at the very end

The data is saved in shared memory as an int16 NumPy array (c).

To feed the sound data into PyAudio's stream.write() I have to do the following conversion:

>>> c = np.array([  1,   1,   2,   3,   5, 888])
>>> c
array([  1,   1,   2,   3,   5, 888])
>>> bytedata = b''.join(c)
>>> bytedata
b'\x01\x00\x00\x00\x01\x00\x00\x00\x02\x00\x00\x00\x03\x00\x00\x00\x05\x00\x00\x00x\x03\x00\x00'
>>>
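For reference, here is roughly how the resulting bytes get consumed (a sketch; the channel count and sample rate are assumptions, not my real settings):

import pyaudio

p = pyaudio.PyAudio()
stream = p.open(format=pyaudio.paInt16,  # matches int16 sample data
                channels=1,              # assumption: mono
                rate=44100,              # assumption: source sample rate
                output=True)
stream.write(bytedata)                   # write() expects a bytes-like object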

Is it possible to store this bytes format in a shared_memory location? Ideally, a working version of:

store_byte_array = np.bytearray(c, dtype=np.int16, buffer=shm.buf)
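In other words, would it be valid to skip NumPy entirely and just copy the raw bytes into the block and read them back out? Something like this untested sketch (sound_bytes and n_bytes are placeholders):

# producer: copy the raw bytes straight into the shared block
shm = shared_memory.SharedMemory(create=True, size=len(sound_bytes))
shm.buf[:len(sound_bytes)] = sound_bytes

# consumer: attach by name and pull the bytes back out
existing_shm = shared_memory.SharedMemory(name=shm.name)
playback = bytes(existing_shm.buf[:n_bytes])  # n_bytes would need to be passed separately
stream.write(playback)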

Thanks in advance!

Cnicho35
  • Why do you feel the need to use numpy? – rici Jun 08 '20 at 02:30
  • You can try using the `mmap` module if you insist on using shared memory or you can use a pipe/socket. – Mia Jun 08 '20 at 03:12
  • Does this answer your question? [Use numpy array in shared memory for multiprocessing](https://stackoverflow.com/questions/7894791/use-numpy-array-in-shared-memory-for-multiprocessing) – Joe Jun 08 '20 at 05:35
  • Take a look at `np.frombuffer`. – Joe Jun 08 '20 at 05:35
  • https://github.com/numpy/numpy/issues/15646 – Joe Jun 08 '20 at 05:37
  • https://stackoverflow.com/questions/31171277/sharing-contiguous-numpy-arrays-between-processes-in-python – Joe Jun 08 '20 at 05:37
  • https://stackoverflow.com/questions/23000245/forming-numpy-array-from-array-buffer-from-shared-memory-multiprocessing-fails – Joe Jun 08 '20 at 05:37
  • https://stackoverflow.com/questions/46811709/large-numpy-arrays-in-shared-memory-for-multiprocessing-is-something-wrong-with – Joe Jun 08 '20 at 05:38

0 Answers