I've been searching for an answer to this question for around a month now, so any help is appreciated!
I am using an AVAudioEngine to record audio. This audio is recorded using a tap:
localInput?.installTap(onBus: 0, bufferSize: 4096, format: localInputFormat) {
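For context, the surrounding setup looks roughly like this (a simplified sketch; error handling and the session/stream wiring are omitted):

import AVFoundation

let audioEngine = AVAudioEngine()
let localInput = audioEngine.inputNode   // optional in older SDKs, hence the localInput? above
let localInputFormat = localInput.outputFormat(forBus: 0)

localInput.installTap(onBus: 0, bufferSize: 4096, format: localInputFormat) { (buffer, time) in
    // buffer is an AVAudioPCMBuffer; convert it and write it to the output stream
    let bytes = audioBufferToBytes(audioBuffer: buffer)
    // ... send bytes over the stream
}

try? audioEngine.start()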
The audio arrives as an AVAudioPCMBuffer and needs to be converted to [UInt8]. I do so with this method:
func audioBufferToBytes(audioBuffer: AVAudioPCMBuffer) -> [UInt8] {
    let srcLeft = audioBuffer.floatChannelData![0]
    let bytesPerFrame = audioBuffer.format.streamDescription.pointee.mBytesPerFrame
    let numBytes = Int(bytesPerFrame * audioBuffer.frameLength)

    // Initialize the byte array with zeros
    var audioByteArray = [UInt8](repeating: 0, count: numBytes)

    // Copy the raw float samples of the first channel into the byte array
    srcLeft.withMemoryRebound(to: UInt8.self, capacity: numBytes) { srcByteData in
        audioByteArray.withUnsafeMutableBufferPointer {
            $0.baseAddress!.initialize(from: srcByteData, count: numBytes)
        }
    }

    return audioByteArray
}
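The resulting bytes are written to the output stream roughly like this (sketch; the stream setup is omitted and outputStream is just a placeholder name for whatever OutputStream the session provides):

// Inside the tap block, after converting the buffer (sketch):
let bytes = audioBufferToBytes(audioBuffer: buffer)
bytes.withUnsafeBufferPointer { ptr in
    _ = outputStream.write(ptr.baseAddress!, maxLength: ptr.count)
}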
On the receiving device, the data needs to be converted back to an AVAudioPCMBuffer so that it can be played. I use this method:
func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: true)
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)
    audioBuffer.frameLength = frameLength
    let dstLeft = audioBuffer.floatChannelData![0]

    // Reinterpret the received bytes as Float samples and copy them into the buffer
    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!).bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }

    return audioBuffer
}
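For reference, playback on the receiving side is wired up roughly like this (a simplified sketch; the actual engine/player setup in my project is more involved, and the names here are placeholders):

import AVFoundation

let playbackEngine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()

// Called whenever a chunk of bytes arrives from the stream (sketch)
func play(_ receivedBytes: [UInt8]) {
    let buffer = bytesToAudioBuffer(receivedBytes)

    if playerNode.engine == nil {
        playbackEngine.attach(playerNode)
        // The connection format has to match the format of the buffers we schedule
        playbackEngine.connect(playerNode, to: playbackEngine.mainMixerNode, format: buffer.format)
        try? playbackEngine.start()
        playerNode.play()
    }

    playerNode.scheduleBuffer(buffer, completionHandler: nil)
}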
However, there must be something wrong with my logic, because when I play the audio on the receiving device I do hear something, but it just sounds like static.
Any help is appreciated, as I said, I've been stuck on this issue for a while now.
EDIT
Thanks for the help so far. I've switched to using Data, so my conversion now looks like this (I found this code online):
func audioBufferToData(audioBuffer: AVAudioPCMBuffer) -> Data {
    let channelCount = 1
    let bufferLength = (audioBuffer.frameCapacity * audioBuffer.format.streamDescription.pointee.mBytesPerFrame)
    let channels = UnsafeBufferPointer(start: audioBuffer.floatChannelData, count: channelCount)
    let data = Data(bytes: channels[0], count: Int(bufferLength))
    return data
}
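The Data is then written to the output stream in much the same way as before (sketch; outputStream is again a placeholder for the stream my session provides):

let data = audioBufferToData(audioBuffer: buffer)
let bytes = [UInt8](data)
bytes.withUnsafeBufferPointer { ptr in
    _ = outputStream.write(ptr.baseAddress!, maxLength: ptr.count)
}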
And the conversion back to AVAudioPCMBuffer looks like this:
func dataToAudioBuffer(data: Data) -> AVAudioPCMBuffer {
    let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(data.count) / 2)
    audioBuffer.frameLength = audioBuffer.frameCapacity

    // Interpret the incoming bytes as little-endian Int16 samples and convert them to Float
    for i in 0..<data.count / 2 {
        audioBuffer.floatChannelData?.pointee[i] = Float(Int16(data[i * 2 + 1]) << 8 | Int16(data[i * 2])) / Float(INT16_MAX)
    }

    return audioBuffer
}
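A quick round-trip check along these lines (sketch; recordedBuffer is a placeholder for a buffer captured by the tap) is an easy way to compare what goes in with what comes out:

// Round-trip check (sketch): convert a captured buffer to Data and back,
// then compare frame counts and the first few samples.
let data = audioBufferToData(audioBuffer: recordedBuffer)
let restored = dataToAudioBuffer(data: data)

print("original frames: \(recordedBuffer.frameLength), restored frames: \(restored.frameLength)")
for i in 0..<5 {
    print(recordedBuffer.floatChannelData![0][i], restored.floatChannelData![0][i])
}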
Unfortunately, the same problem still exists...
EDIT 2
I've created a project that simulates this issue. All it does is record audio, convert it to Data, convert it back to an AVAudioPCMBuffer, and play the audio.
Here is the link: https://github.com/Lkember/IntercomTest
EDIT 3
There was a crash when using a device with 2 channels, but I've fixed it.
EDIT 4
The submitted answer fixed the issue in my sample project, but it did not fix the issue in my main project. I've asked a new question here: