
I've been searching for an answer to this question for around a month now, so any help is appreciated!

I am using an AVAudioEngine to record audio. This audio is recorded using a tap:

localInput?.installTap(onBus: 0, bufferSize: 4096, format: localInputFormat) { (buffer, time) in
    // buffer is an AVAudioPCMBuffer containing the captured audio
}

The audio arrives as an AVAudioPCMBuffer, and it needs to be converted to type [UInt8].

I do so with this method:

func audioBufferToBytes(audioBuffer: AVAudioPCMBuffer) -> [UInt8] {
    let srcLeft = audioBuffer.floatChannelData![0]
    let bytesPerFrame = audioBuffer.format.streamDescription.pointee.mBytesPerFrame
    let numBytes = Int(bytesPerFrame * audioBuffer.frameLength)
    
    // zero-initialize the byte array
    var audioByteArray = [UInt8](repeating: 0, count: numBytes)
    
    srcLeft.withMemoryRebound(to: UInt8.self, capacity: numBytes) { srcByteData in
        audioByteArray.withUnsafeMutableBufferPointer {
            $0.baseAddress!.initialize(from: srcByteData, count: numBytes)
        }
    }
    
    return audioByteArray
}

The audio is then written to the output stream. On another device the data needs to be converted back to AVAudioPCMBuffer so that it can be played. I use this method:
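For completeness, the write side looks roughly like this (a sketch, not my exact code; `outputStream` is assumed to be an already-opened OutputStream and `buffer` a tap buffer):

```swift
let bytes = audioBufferToBytes(audioBuffer: buffer)
bytes.withUnsafeBufferPointer { ptr in
    // write(_:maxLength:) may write fewer bytes than requested;
    // a robust version would loop until everything is sent.
    let written = outputStream.write(ptr.baseAddress!, maxLength: bytes.count)
    print("wrote \(written) of \(bytes.count) bytes")
}
```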

func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: true)
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame
    
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)
    audioBuffer.frameLength = frameLength
    
    let dstLeft = audioBuffer.floatChannelData![0]
    
    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!).bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }
    
    return audioBuffer
}
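To convince myself that the raw byte copy itself is lossless, here is a pure-Swift round trip with some made-up sample values (no audio involved). It mirrors what the two methods above do with memory:

```swift
let samples: [Float] = [0.0, 0.5, -0.25, 1.0]

// Float32 samples -> raw bytes (what audioBufferToBytes effectively does)
let bytes = samples.withUnsafeBufferPointer { ptr -> [UInt8] in
    return [UInt8](UnsafeRawBufferPointer(ptr))
}

// raw bytes -> Float32 samples (what bytesToAudioBuffer effectively does)
let restored = bytes.withUnsafeBufferPointer { ptr -> [Float] in
    return UnsafeRawBufferPointer(ptr).bindMemory(to: Float.self).map { $0 }
}

assert(restored == samples) // the round trip is lossless
```

So if the copy is lossless, any static on playback would have to come from the two sides disagreeing about the format (sample rate, channel count, or interleaving).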

However, there must be something wrong with my logic, because when I play the audio on the other device I do hear something, but it just sounds like static.

Any help is appreciated, as I said, I've been stuck on this issue for a while now.


EDIT

Thanks for the help so far. I've switched to using Data. So my conversion looks like this (I found this code online):

func audioBufferToData(audioBuffer: AVAudioPCMBuffer) -> Data {
    let channelCount = 1
    let bufferLength = (audioBuffer.frameCapacity * audioBuffer.format.streamDescription.pointee.mBytesPerFrame)
    
    let channels = UnsafeBufferPointer(start: audioBuffer.floatChannelData, count: channelCount)
    let data = Data(bytes: channels[0], count: Int(bufferLength))

    return data
}

And the conversion back to AVAudioPCMBuffer looks like this:

func dataToAudioBuffer(data: Data) -> AVAudioPCMBuffer {
    let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(data.count)/2)
    audioBuffer.frameLength = audioBuffer.frameCapacity
    for i in 0..<data.count/2 {
        audioBuffer.floatChannelData?.pointee[i] = Float(Int16(data[i*2+1]) << 8 | Int16(data[i*2]))/Float(INT16_MAX)
    }
    
    return audioBuffer
}
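Note that `audioBufferToData` copies raw Float32 bytes, while `dataToAudioBuffer` reassembles those bytes as little-endian Int16 samples and rescales them, so the two functions are not inverses of each other. A Float32 read-back matching the write side would look something like this (a sketch, untested, using the same 44100 Hz mono format as earlier):

```swift
func dataToFloatBuffer(data: Data) -> AVAudioPCMBuffer {
    let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                               sampleRate: 44100, channels: 1, interleaved: false)
    let frameLength = UInt32(data.count) / format.streamDescription.pointee.mBytesPerFrame
    let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameLength)
    buffer.frameLength = frameLength
    // reinterpret the raw bytes as Float32, exactly as they were written
    data.withUnsafeBytes { (src: UnsafePointer<Float>) in
        buffer.floatChannelData![0].initialize(from: src, count: Int(frameLength))
    }
    return buffer
}
```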

Unfortunately, the same problem still exists...


EDIT 2

I've created a project that will simulate this issue. All it does is record audio, convert it to Data, convert it back to an AVAudioPCMBuffer, and play the audio.

Here is the link: https://github.com/Lkember/IntercomTest


EDIT 3

There was a crash when using a device with 2 channels, but I've fixed it.


EDIT 4

The submitted answer fixed the issue in my sample project; however, it did not fix the issue in my main project. I've added a new question here:

How to send NSData over an OutputStream

Kember

3 Answers


Disclaimer: this is based solely on the theory from the Apple docs; I haven't done this myself, and your code isn't informative enough to understand the whole thing you are trying to accomplish.

First of all, you are trying to convert the .floatChannelData to UInt8. According to the docs, the UInt8 initializer

Creates a new instance by rounding the given floating-point value toward zero.

This would result in an array filled with wrong or, worse, empty values (empty, as in zero).

In my understanding, .withMemoryRebound will NOT let you access a floating-point number as an integer. The implicit conversion would truncate the numbers and therefore distort your result. This is not what you want.

Instead, you should use Audio Converter Services (documentation) to convert your floating-point audio buffer safely and losslessly to an integer audio buffer.

I think this should point you in the right direction. You should also check the format of your AVAudioPCMBuffer before starting the conversion, since the handling could be case-dependent.
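From Swift, the corresponding high-level API is AVAudioConverter. A sketch (untested; formats and names are illustrative) of converting a Float32 tap buffer to interleaved Int16 PCM:

```swift
import AVFoundation

// `buffer` is assumed to be the Float32 AVAudioPCMBuffer from the tap.
func convertToInt16(_ buffer: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    let srcFormat = buffer.format
    let dstFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                  sampleRate: srcFormat.sampleRate,
                                  channels: srcFormat.channelCount,
                                  interleaved: true)
    let converter = AVAudioConverter(from: srcFormat, to: dstFormat)
    let outBuffer = AVAudioPCMBuffer(pcmFormat: dstFormat,
                                     frameCapacity: buffer.frameCapacity)
    var error: NSError?
    _ = converter.convert(to: outBuffer, error: &error) { _, outStatus in
        // a real implementation should return .noDataNow after the
        // first call instead of handing back the same buffer again
        outStatus.pointee = .haveData
        return buffer
    }
    return error == nil ? outBuffer : nil
}
```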

I hope I could help.

Maurice
  • Thanks for your reply. I will definitely look into the link you sent me and see what I can figure out. Otherwise, if you want more information let me know. Or I could link you to my github? – Kember Mar 15 '17 at 21:22
  • Ya, send me the GitHub, I may figure something out for you. I sadly don't use swift but objective-c myself, but I guess I can cope with that. – Maurice Mar 15 '17 at 21:25
  • Thank you for taking a look, here is the link: https://github.com/Lkember/MotoIntercom – Kember Mar 15 '17 at 21:30
  • @Kember I don't want to search through the whole project, what file are we talking about here? – Maurice Mar 15 '17 at 21:46
  • Sorry, it's in PhoneViewController – Kember Mar 15 '17 at 21:56
  • http://stackoverflow.com/questions/28048568/convert-avaudiopcmbuffer-to-nsdata-and-back You should have a look at this question. Maybe you can use NSData instead. Looking at your code I assume it is what I suggested: a conversion error. Can not test right now though - sorry – Maurice Mar 15 '17 at 22:07
  • I tried using NSData originally but to write to an OutputStream it requires "UnsafePointer". – Kember Mar 15 '17 at 22:16
  • https://developer.apple.com/library/content/documentation/Swift/Conceptual/BuildingCocoaApps/InteractingWithCAPIs.html please read this carefully. You should be able to convert the nsdata to an UTF8 string and write that to the outputStream regardless. – Maurice Mar 15 '17 at 22:26
  • That's a pretty big project - can you reproduce the problem in a simpler app? One that doesn't need to search for peers? – Rhythmic Fistman Mar 16 '17 at 12:32
  • @RhythmicFistman Sorry I've taken so long to create it, but I've added the link in the post. All you have to do is press Start and it will record audio, convert that audio to Data, convert it back to AVAudioPCMBuffer and play it. – Kember Mar 20 '17 at 22:01

Here you go:

func audioBufferToNSData(PCMBuffer: AVAudioPCMBuffer) -> NSData {
    let channelCount = 1  // given PCMBuffer channel count is 1
    let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: channelCount)
    let data = NSData(bytes: channels[0], length: Int(PCMBuffer.frameCapacity * PCMBuffer.format.streamDescription.pointee.mBytesPerFrame))
    return data
}

func dataToAudioBuffer(data: NSData) -> AVAudioPCMBuffer {
    let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: false)
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(data.length) / audioFormat.streamDescription.pointee.mBytesPerFrame)
    audioBuffer.frameLength = audioBuffer.frameCapacity
    let channels = UnsafeBufferPointer(start: audioBuffer.floatChannelData, count: Int(audioBuffer.format.channelCount))
    data.getBytes(UnsafeMutableRawPointer(channels[0]), length: data.length)
    return audioBuffer
}
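For completeness, playing the restored buffer on the receiving device might look like this (a sketch; `engine` is an assumed, already-configured AVAudioEngine and `receivedData` the NSData read from the stream):

```swift
import AVFoundation

let restored = dataToAudioBuffer(data: receivedData)

let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: restored.format)

try? engine.start()
player.scheduleBuffer(restored, completionHandler: nil)
player.play()
```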

Logan
  • This didn't completely solve my problem (it did in the sample project) but the problem still persists on my main project. – Kember Mar 21 '17 at 19:35
  • Hi hope you help me... have you any sample for streaming mic voice using multipeer connectivity – Saurabh Jain Jul 25 '17 at 10:53
  • @saurabh I don’t have a sample, but I’ve done it in the past. Just record audio using an AVAudioEngine object and convert the buffer to NSData using the formula above, then stream that NSData. Then on the other device convert it back to a buffer also using the function above and then play that buffer using an AVAudioPlayer. – Logan Jul 26 '17 at 13:46
  • @Logan is there any delay between sending and receiving? – Saurabh Jain Jul 26 '17 at 14:34
  • @saurabh At first the delay was about half a second, but since then I’ve got the delay down to be hardly noticeable. – Logan Jul 26 '17 at 14:35
  • @Logan I am use the multipeer connectivity for streaming audio but my audio received another side, I have found many class and answer but I am unable to send mic voice, I am struggling over a week but not find any solution.. Please help me – Saurabh Jain Jul 26 '17 at 14:40
  • @SaurabhJain What do you have so far? Are you recording using an AVAudioEngine? – Logan Jul 26 '17 at 14:52
  • @Logan I am found here https://stackoverflow.com/questions/26270127/ios8-avaudioengine-how-to-send-microphone-data-over-multipeer-connectivity But the installTapOnBus block call very fast so that the data buffer not send to other side. Please provide me any solution or reference. Right now I am using https://github.com/tonyd256/TDAudioStreamer this class to record and send the audio. But I want audio streaming just like calling.. I have also ask question regarding this please check https://stackoverflow.com/questions/45272471/voice-over-bluetooth-in-ios – Saurabh Jain Jul 26 '17 at 15:01
  • @SaurabhJain I'm not sure I understand what you mean when you say the "installTapOnBus block call very fast so that data buffer not send to other side"... I'm not understanding this... Basically in your installTapOnBus, you need to convert the buffer to NSData using the above method, and then write to the OutputStream. – Logan Jul 26 '17 at 17:45

Check out https://www.iis.fraunhofer.de/en/ff/amm/dl/whitepapers.html. Using the info there, I did something very similar. There's a detailed PDF and some sample code to get you started.

Dave Paul