9

I have created a group video chat app for iOS. I have been searching for a way to control the audio volume of each participant separately. I found a way to mute and unmute using isPlaybackEnabled on RemoteAudioTrack, but no way to control the volume.
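
For context, this is roughly how I mute a participant now (the property chain is from the Twilio Video SDK, and remoteParticipant stands in for whichever participant you hold; take it as a sketch):

if let audioTrack = remoteParticipant.remoteAudioTracks.first?.remoteTrack {
    audioTrack.isPlaybackEnabled = false // all-or-nothing: no per-participant volume here
}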

I also wondered whether I could play the audio through an AVAudioPlayer instead. I found addSink, and this is what I tried, from here:

class Audio: NSObject, AudioSink {
    var a = 1

    func renderSample(_ audioSample: CMSampleBuffer!) {
        print("audio found", a)
        a += 1

        var audioBufferList = AudioBufferList()
        var data = Data()
        var blockBuffer: CMBlockBuffer?

        // Copy the sample buffer's audio into an AudioBufferList.
        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            audioSample,
            bufferListSizeNeededOut: nil,
            bufferListOut: &audioBufferList,
            bufferListSize: MemoryLayout<AudioBufferList>.size,
            blockBufferAllocator: nil,
            blockBufferMemoryAllocator: nil,
            flags: 0,
            blockBufferOut: &blockBuffer)
        let buffers = UnsafeBufferPointer<AudioBuffer>(
            start: &audioBufferList.mBuffers,
            count: Int(audioBufferList.mNumberBuffers))

        // Append the raw bytes of every buffer to a single Data value.
        for audioBuffer in buffers {
            let frame = audioBuffer.mData?.assumingMemoryBound(to: UInt8.self)
            data.append(frame!, count: Int(audioBuffer.mDataByteSize))
        }

        let player = try! AVAudioPlayer(data: data) // crash here
        player.play()
    }
}

But it crashed on let player = try! AVAudioPlayer(data: data).


EDIT:
This is the error: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=NSOSStatusErrorDomain Code=-39 "(null)": file.

This is what data contains, so I guess the buffer is not being converted:

▿ 0 bytes
  - count : 0
  ▿ pointer : 0x000000016d7ae160
    - pointerValue : 6131736928
  - bytes : 0 elements

And this is the format description of the audioSample:

<CMAudioFormatDescription 0x2815a3de0 [0x1bb2ef830]> {
    mediaType:'soun' 
    mediaSubType:'lpcm' 
    mediaSpecific: {
        ASBD: {
            mSampleRate: 16000.000000 
            mFormatID: 'lpcm' 
            mFormatFlags: 0xc 
            mBytesPerPacket: 2 
            mFramesPerPacket: 1 
            mBytesPerFrame: 2 
            mChannelsPerFrame: 1 
            mBitsPerChannel: 16     } 
        cookie: {(null)} 
        ACL: {(null)}
        FormatList Array: {(null)} 
    } 
    extensions: {(null)}
}
Daniel
Alok Subedi
    First off, what is your error? Crashing can mean 10000000 things. – impression7vx Jul 24 '19 at 18:12
    [This](https://thoughtbot.com/blog/streaming-audio-to-multiple-listeners-via-ios-multipeer-connectivity) could help. You probably want to use [Audio Queue Services](https://developer.apple.com/documentation/audiotoolbox/audio_queue_services). I have a hunch creating a new `AVAudioPlayer` for each little packet might cause a lot of choppiness. – Daniel Jul 29 '19 at 18:04
  • How are you initializing and getting the data for your CMSampleBuffer? What package are you using? I will add an answer showcasing how you can do this with Audio Queue Services in a bit. – Rakeeb Hossain Jul 30 '19 at 15:22
  • @RakeebHossain I am using Twilio. Twilio sends audio as a `RemoteAudioTrack`, which is played automatically. For more control they have a protocol `AudioSink`, which has one function that delivers a `CMSampleBuffer`, as in the Audio class above. – Alok Subedi Jul 31 '19 at 03:31

2 Answers

4

You can get the full data buffer from CMSampleBuffer and convert it to Data:

guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return }
let blockBufferDataLength = CMBlockBufferGetDataLength(blockBuffer)
var blockBufferData = [UInt8](repeating: 0, count: blockBufferDataLength)
let status = CMBlockBufferCopyDataBytes(blockBuffer,
                                        atOffset: 0,
                                        dataLength: blockBufferDataLength,
                                        destination: &blockBufferData)
guard status == noErr else { return }
let data = Data(bytes: blockBufferData, count: blockBufferDataLength)

Also refer to the AVAudioPlayer overview:

Use this class for audio playback unless you are playing audio captured from a network stream or require very low I/O latency.

So I don't think AVAudioPlayer will work for you: it expects a complete audio file, and the raw LPCM chunks from the sink carry no file header, which is consistent with the -39 (eofErr) you are seeing. You would be better off with AVAudioEngine or Audio Queue Services.
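
As a minimal sketch of the AVAudioEngine route: the class below assumes the 16 kHz, mono, 16-bit integer format from the question's ASBD, converts each chunk to float samples, and schedules them on an AVAudioPlayerNode whose volume then acts as a per-participant control. The type and method names (ParticipantAudioPlayer, schedule) are mine, not Twilio's, and this is untested against the real AudioSink callback:

import AVFoundation

// Sketch: one engine/player per participant so each gets its own volume.
final class ParticipantAudioPlayer {
    private let engine = AVAudioEngine()
    private let playerNode = AVAudioPlayerNode()
    // Float32 is the safest buffer format for AVAudioPlayerNode;
    // sample rate and channel count match the question's ASBD.
    private let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                       sampleRate: 16_000,
                                       channels: 1,
                                       interleaved: false)!

    /// Per-participant volume, 0.0 ... 1.0.
    var volume: Float {
        get { playerNode.volume }
        set { playerNode.volume = newValue }
    }

    init() throws {
        engine.attach(playerNode)
        engine.connect(playerNode, to: engine.mainMixerNode, format: format)
        try engine.start()
        playerNode.play()
    }

    /// Feed this with the Data extracted from each CMSampleBuffer.
    func schedule(_ data: Data) {
        let sampleCount = data.count / MemoryLayout<Int16>.size
        guard sampleCount > 0,
              let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: AVAudioFrameCount(sampleCount))
        else { return }
        buffer.frameLength = AVAudioFrameCount(sampleCount)
        // Convert the 16-bit integer samples to floats in -1.0 ... 1.0.
        data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
            let int16Samples = raw.bindMemory(to: Int16.self)
            let channel = buffer.floatChannelData![0]
            for i in 0..<sampleCount {
                channel[i] = Float(int16Samples[i]) / Float(Int16.max)
            }
        }
        playerNode.scheduleBuffer(buffer, completionHandler: nil)
    }
}

With one of these per RemoteAudioTrack, each renderSample call becomes one schedule(data) call, and setting volume scales just that participant, which isPlaybackEnabled alone cannot do.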

Pavel Kozlov
    Got the data. I will try `Audio Queue Services` and update. Thank you – Alok Subedi Aug 06 '19 at 04:34
  • Hey Alok. Did you ever get this figured out? I'm trying to accomplish pretty much the same thing. Using Twilio and trying to get an onscreen volume control to work. Have spent hours working with MPVolumeView which was a dead end. Trying to figure out now how to feed the CMSampleBuffer into AVAudioPlayer so that MPVolumeView might function as expected. – jones_corey Apr 02 '20 at 03:07
-4

Try saving the audio to a file in the documents directory and then playing that file. This works for me.

    func playMusic() {
        // Load a bundled mp3 and play it through a shared AVAudioPlayer.
        let url = Bundle.main.url(forResource: "Audio", withExtension: "mp3")!
        let data = try! Data(contentsOf: url)
        try! AVAudioSession.sharedInstance().setCategory(.playback)
        try! AVAudioSession.sharedInstance().setActive(true)
        audioPlayer = try! AVAudioPlayer(data: data, fileTypeHint: AVFileType.mp3.rawValue)
        audioPlayer.prepareToPlay()
        audioPlayer.play()
    }
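
For completeness, the saving step asked about in the comments could look something like this, matching what the comment below sketches (the file name and the use of FileManager.default are my own assumptions):

    func saveToDocuments(_ data: Data) throws -> URL {
        // Write the audio data to Documents/Music.mp3 and return its URL.
        let documentDirectory = try FileManager.default.url(for: .documentDirectory,
                                                            in: .userDomainMask,
                                                            appropriateFor: nil,
                                                            create: false)
        let fileURL = documentDirectory.appendingPathComponent("Music.mp3")
        try data.write(to: fileURL)
        return fileURL
    }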

Manav
  • Can you provide code to save the audio file to the documents directory? Please – Alok Subedi Jul 29 '19 at 11:28
  • do { let documentDirectory = try FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false); let fileURL = documentDirectory.appendingPathComponent("Music.mp3"); try data.write(to: fileURL) } catch { print(error) } – Manav Jul 29 '19 at 12:31