I have set up the basics of Multipeer Connectivity successfully, and my two devices recognize and connect to each other as peers. However, I am having trouble streaming and playing audio using AVAudioPlayerNode. I have a startRecording() function that is called when a button is toggled, and it should open and close the OutputStream.
My general understanding is that you need to create an AVAudioEngine, attach an AVAudioPlayerNode, and connect the AVAudioPlayerNode to the main mixer (or the output node?). Then, from the sending device, start an OutputStream and call installTap() on the audio engine's inputNode, which continuously sends the bytes converted from each AVAudioPCMBuffer.
In the Multipeer delegate functions, once a stream is received, I set the ViewController's inputStream, which I think is then handled by the StreamDelegate functions. The incoming bytes should be handled there (if there is space). The code seems to be functional, but the .hasBytesAvailable stream event is rarely delivered. I have microphone access enabled on both devices too.
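To be concrete, the receive side I have in mind looks roughly like this. This is a sketch rather than my exact code; playReceivedBytes(_:) is a placeholder for the step that converts the bytes back into an AVAudioPCMBuffer and schedules it on the player node:

```swift
import Foundation

// Placeholder: would convert bytes to an AVAudioPCMBuffer and call
// audioPlayer.scheduleBuffer(_:completionHandler:) on the attached node.
func playReceivedBytes(_ bytes: [UInt8]) {
}

// Sketch of the StreamDelegate callback on the receiving ViewController.
func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
    switch eventCode {
    case .hasBytesAvailable:
        guard let input = aStream as? InputStream else { return }
        var chunk = [UInt8](repeating: 0, count: 4096)
        // Drain everything currently buffered on the stream.
        while input.hasBytesAvailable {
            let read = input.read(&chunk, maxLength: chunk.count)
            guard read > 0 else { break }
            playReceivedBytes(Array(chunk[0..<read]))
        }
    case .errorOccurred:
        print("stream error: \(String(describing: aStream.streamError))")
    case .endEncountered:
        aStream.close()
        aStream.remove(from: .main, forMode: .default)
    default:
        break
    }
}
```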
Since it's hard to fit all the code in this post, I have created a public gist with the relevant files. Here are the relevant parts from the three main classes: ViewController.swift, MultipeerHandler.swift, and StreamHelper.swift.

In ViewController.swift:
override func viewDidLoad() {
    super.viewDidLoad()
    configureAudioSession()
    audioEngine = AVAudioEngine()
    inputNode = audioEngine.inputNode
    audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)
    multipeerSetup()
    viewSetup()
}

private func configureAudioSession() {
    do {
        try audioSession.setCategory(.playAndRecord, options: .defaultToSpeaker)
        try audioSession.setMode(.voiceChat)
        try audioSession.setActive(true)
        audioSession.requestRecordPermission() { [unowned self] (allowed: Bool) -> Void in
            DispatchQueue.main.async {
                if allowed {
                    print("allowed")
                }
            }
        }
    } catch {
        print("audio session setup failed: \(error)")
    }
}
@objc func startRecording() throws {
    // start streaming
    if !(self.isRecording) {
        audioEngine.stop()
        let inputFormat = inputNode.inputFormat(forBus: 0)
        audioEngine.attach(audioPlayer)
        audioEngine.connect(audioPlayer, to: mainMixer, format: audioFormat)
        // audioEngine.connect(audioPlayer, to: audioPlayer.outputNode, format: inputFormat)
        do {
            if multipeerHandler.session.connectedPeers.count > 0 {
                if outputStream != nil {
                    outputStream = nil
                }
                outputStream = try multipeerHandler.session.startStream(withName: "voice", toPeer: multipeerHandler.session.connectedPeers[0])
                outputStream.schedule(in: RunLoop.main, forMode: .default)
                outputStream.delegate = self
                outputStream.open()
                inputNode.installTap(onBus: 0, bufferSize: AVAudioFrameCount(inputFormat.sampleRate / 10), format: inputFormat) { (buffer: AVAudioPCMBuffer, when: AVAudioTime) in
                    let convertedFrameCount = AVAudioFrameCount((Double(buffer.frameLength) / inputFormat.sampleRate) * inputFormat.sampleRate)
                    guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: convertedFrameCount) else {
                        print("cannot make pcm buffer")
                        return
                    }
                    print("\(#line)")
                    let bytes = StreamHelper.copyAudioBufferBytes(pcmBuffer)
                    if self.outputStream.hasSpaceAvailable {
                        self.outputStream.write(bytes, maxLength: bytes.count)
                        print("\(#line)")
                    }
                }
                audioEngine.prepare()
                try audioEngine.start()
            } else {
                print("no peers to connect to")
            }
        } catch let error {
            print(error.localizedDescription)
        }
        self.isRecording = true
    } else {
        // stop streaming
        inputNode.removeTap(onBus: 0)
        self.isRecording = false
    }
}
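For context, StreamHelper.copyAudioBufferBytes flattens the buffer's float32 channel data into a byte array for the stream. A simplified sketch of that kind of helper (not my exact gist code) looks like this:

```swift
import AVFoundation

// Pure helper: flatten float32 samples into their little-endian bytes.
func floatsToBytes(_ floats: [Float]) -> [UInt8] {
    var bytes = [UInt8]()
    bytes.reserveCapacity(floats.count * MemoryLayout<Float>.size)
    for value in floats {
        withUnsafeBytes(of: value) { bytes.append(contentsOf: $0) }
    }
    return bytes
}

// Sketch of a copyAudioBufferBytes-style helper (not the exact gist code):
// copies frameLength frames from each channel of the buffer.
func copyAudioBufferBytes(_ audioBuffer: AVAudioPCMBuffer) -> [UInt8] {
    guard let channelData = audioBuffer.floatChannelData else { return [] }
    let frameLength = Int(audioBuffer.frameLength)
    var result = [UInt8]()
    for channel in 0..<Int(audioBuffer.format.channelCount) {
        let samples = Array(UnsafeBufferPointer(start: channelData[channel], count: frameLength))
        result.append(contentsOf: floatsToBytes(samples))
    }
    return result
}
```

One thing worth noting about a helper like this: it only copies frameLength frames, so a buffer whose frameLength is still 0 produces an empty byte array.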
In MultipeerHandler.swift:
func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {
    print("didReceiveStream")
    if streamName == "voice" {
        print(#line)
        viewController.inputStream = stream
        viewController.inputStream.delegate = viewController
        viewController.inputStream.schedule(in: RunLoop.main, forMode: .default)
        viewController.inputStream.open()
    }
}
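Once bytes come in over that stream, my plan is to turn them back into an AVAudioPCMBuffer and schedule it on the player node. A sketch of that conversion (bytesToPCMBuffer is my placeholder name, assuming the same non-interleaved float32 mono format the sender uses):

```swift
import AVFoundation

// Pure helper: reinterpret little-endian float32 bytes as samples.
func bytesToFloats(_ bytes: [UInt8]) -> [Float] {
    let count = bytes.count / MemoryLayout<Float>.size
    return bytes.withUnsafeBytes { raw in
        Array(raw.bindMemory(to: Float.self).prefix(count))
    }
}

// Sketch (placeholder name): rebuild an AVAudioPCMBuffer from raw bytes.
func bytesToPCMBuffer(_ bytes: [UInt8], format: AVAudioFormat) -> AVAudioPCMBuffer? {
    let floats = bytesToFloats(bytes)
    guard !floats.isEmpty,
          let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: AVAudioFrameCount(floats.count)),
          let channelData = buffer.floatChannelData else { return nil }
    // frameLength must be set explicitly, or the buffer plays as silence.
    buffer.frameLength = AVAudioFrameCount(floats.count)
    channelData[0].assign(from: floats, count: floats.count)
    return buffer
}

// Usage (sketch): schedule on the attached player node.
// if let buf = bytesToPCMBuffer(received, format: audioFormat) {
//     audioPlayer.scheduleBuffer(buf, completionHandler: nil)
//     if !audioPlayer.isPlaying { audioPlayer.play() }
// }
```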
Also, it is worth noting that I have found a lot of similar posts, but they are somewhat old. Here are some of the sources I used to understand what's going on:
- https://github.com/nullforlife/Multipeer-Voice-Chat/blob/master/Multipeer%20Voice%20Chat/ViewController.swift
- Playing an audio file repeatedly with AVAudioEngine
- Trouble hooking up AVAudioUnitEffect with AVAudioEngine
- AVAudioPlayerNode doesn't play sound
- AVAudioEngine inputNode's format changes when playing an AVAudioPlayerNode
Once again, I have posted a public gist for this inquiry. Any help would be greatly appreciated.