I have a socket server that sends microphone data over UDP using PyAudio. Code looks something like this:
def callback(in_data, frame_count, time_info, status):
    server_socket.sendto(in_data, udp_client)
    return (None, pyaudio.paContinue)  # tell PyAudio to keep the stream running

stream = pa.open(
    format=pyaudio.paInt16,
    channels=1,
    rate=44100,
    output=False,
    input=True,
    input_device_index=0,
    frames_per_buffer=4096,
    stream_callback=callback
)
Using Python, it's easy to make a client that plays the audio in real time:
chunk = 4096
stream = pa.open(
    format=pyaudio.paInt16,
    channels=1,
    rate=44100,
    frames_per_buffer=chunk,
    output=True,
    input=False
)
while True:
    data, addr = client_socket.recvfrom(chunk * 2)  # recvfrom returns (bytes, address); 2 bytes per Int16 frame
    stream.write(data)
Even handling this byte array in Python is extremely easy, with this one-liner:
samples = np.frombuffer(in_data, dtype=np.int16)
In Swift 5, however, I have no idea how to handle this stream. I tried to do the same thing; it looks something like this:
var stream_buffer = NSMutableData()

self.connection?.receiveMessage { (data, context, isComplete, error) in
    self.stream_buffer.append(data!)
}
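(As a side note, and an assumption about the setup: `NWConnection.receiveMessage` delivers a single datagram and then stops, so the completion handler has to re-arm itself to keep receiving. A minimal sketch of a continuous receive loop, using `Data` instead of `NSMutableData`; the class and property names are illustrative, not from the original code:)

```swift
import Foundation
import Network

final class UDPReceiver {
    let connection: NWConnection
    var streamBuffer = Data()  // Data is the idiomatic Swift choice over NSMutableData

    init(host: NWEndpoint.Host, port: NWEndpoint.Port) {
        connection = NWConnection(host: host, port: port, using: .udp)
    }

    func start() {
        connection.stateUpdateHandler = { [weak self] state in
            if case .ready = state { self?.receiveLoop() }
        }
        connection.start(queue: .global(qos: .userInteractive))
    }

    private func receiveLoop() {
        connection.receiveMessage { [weak self] data, _, _, error in
            guard let self = self else { return }
            if let data = data {
                self.streamBuffer.append(data)  // hand the bytes off to the audio side here
            }
            if error == nil {
                self.receiveLoop()  // re-arm to receive the next datagram
            }
        }
    }
}
```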
And once stream_buffer has acquired enough data:
let format = AVAudioFormat(
    commonFormat: AVAudioCommonFormat.pcmFormatInt16,
    sampleRate: 44100,
    channels: 1,
    interleaved: true
)
let buffer_data = stream_buffer as Data
let buffer = buffer_data.toPCMBuffer(format: format!)
(`toPCMBuffer` here is a Data-to-AVAudioPCMBuffer conversion extension.)
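For reference, a sketch of what such an extension might look like for interleaved Int16 mono data. The name `toPCMBuffer` matches the call above, but this body is an illustration under stated assumptions, not the original helper:

```swift
import AVFoundation

extension Data {
    func toPCMBuffer(format: AVAudioFormat) -> AVAudioPCMBuffer? {
        // For interleaved Int16 mono, mBytesPerFrame is 2.
        let bytesPerFrame = Int(format.streamDescription.pointee.mBytesPerFrame)
        let frameCount = AVAudioFrameCount(count / bytesPerFrame)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: frameCount) else { return nil }
        buffer.frameLength = frameCount
        // Copy the raw bytes into the buffer's Int16 channel data.
        withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
            guard let src = raw.baseAddress,
                  let dst = buffer.int16ChannelData?[0] else { return }
            memcpy(dst, src, Int(frameCount) * bytesPerFrame)
        }
        return buffer
    }
}
```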
Yet I have no idea whether I'm doing the byte array conversion correctly, or whether AVPlayer even supports Int16 byte arrays. If not, how would I convert my stream_buffer
to Float32? Has anyone successfully streamed PCM audio to iOS?
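For what it's worth, AVPlayer is aimed at file/URL playback; raw PCM buffers are usually played through AVAudioEngine with an AVAudioPlayerNode, which mixes in Float32. Below is a sketch of a manual Int16 → Float32 conversion (AVAudioConverter is the more general tool); the function name and the usage comments are illustrative assumptions:

```swift
import AVFoundation

// Convert an Int16 PCM buffer to Float32 by scaling samples by 1/32768,
// mapping [-32768, 32767] to roughly [-1.0, 1.0].
func int16ToFloat32(_ src: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    guard let int16Data = src.int16ChannelData,
          let floatFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                          sampleRate: src.format.sampleRate,
                                          channels: src.format.channelCount,
                                          interleaved: false),
          let dst = AVAudioPCMBuffer(pcmFormat: floatFormat,
                                     frameCapacity: src.frameLength) else { return nil }
    dst.frameLength = src.frameLength
    for ch in 0..<Int(src.format.channelCount) {
        let input = int16Data[ch]
        let output = dst.floatChannelData![ch]
        for i in 0..<Int(src.frameLength) {
            output[i] = Float(input[i]) / 32768.0
        }
    }
    return dst
}

// Usage sketch: schedule the converted buffers on a player node.
// let engine = AVAudioEngine()
// let player = AVAudioPlayerNode()
// engine.attach(player)
// engine.connect(player, to: engine.mainMixerNode, format: floatBuffer.format)
// try engine.start()
// player.scheduleBuffer(floatBuffer)
// player.play()
```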