
I've been trying to write a video+audio using AVAssetWriter and AVAssetWriterInputs.

I read multiple posts in this forum from people saying they were able to accomplish that, but it is not working for me. If I write only video, the code does its job very well. When I add audio, the output file is corrupted and cannot be played back.

Here is part of my code:

Setting up AVCaptureVideoDataOutput and AVCaptureAudioDataOutput:

NSError *error = nil;

// Setup the video input
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeVideo];
// Create a device input with the device and add it to the session.
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
// Setup the video output
_videoOutput = [[AVCaptureVideoDataOutput alloc] init];
_videoOutput.alwaysDiscardsLateVideoFrames = NO;
_videoOutput.videoSettings =
    [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// Setup the audio input
AVCaptureDevice *audioDevice     = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error ];     
// Setup the audio output
_audioOutput = [[AVCaptureAudioDataOutput alloc] init];

// Create the session
_capSession = [[AVCaptureSession alloc] init];
[_capSession addInput:videoInput];
[_capSession addInput:audioInput];
[_capSession addOutput:_videoOutput];
[_capSession addOutput:_audioOutput];

_capSession.sessionPreset = AVCaptureSessionPresetLow;     

// Setup the queue
dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
[_videoOutput setSampleBufferDelegate:self queue:queue];
[_audioOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

Setting up AVAssetWriter and associating both audio and video AVAssetWriterInputs to it:

- (BOOL)setupWriter {
    NSError *error = nil;
    _videoWriter = [[AVAssetWriter alloc] initWithURL:videoURL 
                                             fileType:AVFileTypeQuickTimeMovie
                                                error:&error];
    NSParameterAssert(_videoWriter);


    // Add video input
    NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
                                           [NSNumber numberWithDouble:128.0*1024.0], AVVideoAverageBitRateKey,
                                           nil];

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInt:192], AVVideoWidthKey,
                                              [NSNumber numberWithInt:144], AVVideoHeightKey,
                                              videoCompressionProps, AVVideoCompressionPropertiesKey,
                                              nil];

    _videoWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                            outputSettings:videoSettings] retain];


    NSParameterAssert(_videoWriterInput);
    _videoWriterInput.expectsMediaDataInRealTime = YES;


    // Add the audio input
    AudioChannelLayout acl;
    bzero( &acl, sizeof(acl));
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;


    NSDictionary* audioOutputSettings = nil;          
    // Both types of audio settings cause the output video file to be corrupted.
    if (NO) {
        // Should work from the iPhone 3GS and iPod touch 3rd generation onward
        audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                               [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                               [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                               [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
                               [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                               nil];
    } else {
        // Should work on any device, but requires more space
        audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
                               [NSNumber numberWithInt:16], AVEncoderBitDepthHintKey,
                               [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                               [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                               [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                               nil];
    }

    _audioWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                            outputSettings:audioOutputSettings] retain];

    _audioWriterInput.expectsMediaDataInRealTime = YES;

    // add input
    [_videoWriter addInput:_videoWriterInput];
    [_videoWriter addInput:_audioWriterInput];

    return YES;
}

Here are the functions to start/stop video recording:

- (void)startVideoRecording
{
    if (!_isRecording) {
        NSLog(@"start video recording...");
        if (![self setupWriter]) {
             return;
        }
        _isRecording = YES;
    }
}

- (void)stopVideoRecording
{
    if (_isRecording) {
        _isRecording = NO;

        [_videoWriterInput markAsFinished];
        [_videoWriter endSessionAtSourceTime:lastSampleTime];

        [_videoWriter finishWriting];

        NSLog(@"video recording stopped");
    }
}

And finally, the capture output delegate code:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (!CMSampleBufferDataIsReady(sampleBuffer)) {
        NSLog( @"sample buffer is not ready. Skipping sample" );
        return;
    }


    if (_isRecording == YES) {
        lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        if (_videoWriter.status != AVAssetWriterStatusWriting ) {
            [_videoWriter startWriting];
            [_videoWriter startSessionAtSourceTime:lastSampleTime];
        }

        if (captureOutput == _videoOutput) {
            [self newVideoSample:sampleBuffer];
        }

        /*
        // If I add audio to the video, the output file gets corrupted and cannot be played back
        } else {
            [self newAudioSample:sampleBuffer];
        }
        */
    }
}

- (void)newVideoSample:(CMSampleBufferRef)sampleBuffer
{     
    if (_isRecording) {
        if (_videoWriter.status > AVAssetWriterStatusWriting) {
             NSLog(@"Warning: writer status is %d", _videoWriter.status);
             if (_videoWriter.status == AVAssetWriterStatusFailed)
                  NSLog(@"Error: %@", _videoWriter.error);
             return;
        }

        if (![_videoWriterInput appendSampleBuffer:sampleBuffer]) {
             NSLog(@"Unable to write to video input");
        }
    }
}



- (void)newAudioSample:(CMSampleBufferRef)sampleBuffer
{     
    if (_isRecording) {
        if (_videoWriter.status > AVAssetWriterStatusWriting) {
             NSLog(@"Warning: writer status is %d", _videoWriter.status);
             if (_videoWriter.status == AVAssetWriterStatusFailed)
                  NSLog(@"Error: %@", _videoWriter.error);
             return;
        }

        if (![_audioWriterInput appendSampleBuffer:sampleBuffer]) {
             NSLog(@"Unable to write to audio input");
        }
    }
}

I would be very glad if someone could figure out what the problem in this code is.

derpoliuk
kalos
  • I'm having problems with my audio settings with code very similar to yours. My app will record video, but as soon as I tell the AVAssetWriterInput I've made for audio to appendSampleBuffer:, it tells me 'Input buffer must be in an uncompressed format when outputSettings is not nil'. Did you ever come across this problem? It's driving me slightly nutty! – Baza207 Aug 29 '12 at 14:11
  • Hello kalos, is the audio input in your example from the microphone or from the application itself? – justicepenny Nov 19 '12 at 01:57
  • @kalos brother, can you tell me how we can use videoURL? – Gajendra Rawat Dec 24 '13 at 10:52
  • @Baza207 Were you able to fix the problem? I am also going nuts trying to figure this one out. – B K May 26 '16 at 08:14
  • The question itself helped me even though I am using Swift 5 instead of Obj-C. In my session setup I had not set audioWriterInput.expectsMediaDataInRealTime = true, and video recorded with audio came out short because of lost frames. – Hope Jul 19 '19 at 07:05

2 Answers


In startVideoRecording I call the following (I assume you are calling startVideoRecording at some point):

[_capSession startRunning] ;

In stopVideoRecording I do not call

[_videoWriterInput markAsFinished];
[_videoWriter endSessionAtSourceTime:lastSampleTime];

markAsFinished is more for use with the block-style pull model. See requestMediaDataWhenReadyOnQueue:usingBlock: in AVAssetWriterInput for an explanation. The library should calculate the proper timing for interleaving the buffers.
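For reference, here is a minimal sketch of that pull model, which is where markAsFinished naturally belongs; the -nextSampleBuffer source method is hypothetical and not part of the code above:

// Sketch of the pull model (not needed for real-time capture like the code above).
dispatch_queue_t mediaQueue = dispatch_queue_create("com.example.mediaQueue", NULL);
[_videoWriterInput requestMediaDataWhenReadyOnQueue:mediaQueue usingBlock:^{
    while ([_videoWriterInput isReadyForMoreMediaData]) {
        CMSampleBufferRef buffer = [self nextSampleBuffer]; // hypothetical non-real-time source
        if (buffer == NULL) {
            [_videoWriterInput markAsFinished]; // mark the input finished once the source is exhausted
            break;
        }
        [_videoWriterInput appendSampleBuffer:buffer];
        CFRelease(buffer);
    }
}];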

You do not need to call endSessionAtSourceTime:. The last timestamp in the sample data will be used after the call to

[_videoWriter finishWriting];

I also explicitly check for the type of capture output.

else if( captureOutput == _audioOutput) {
    [self newAudioSample:sampleBuffer]; 
}

Here is what I have. The audio and video come through for me. It is possible I changed something. If this does not work for you then I will post everything I have.

-(void) startVideoRecording
{
    if (!_isRecording) {
        NSLog(@"start video recording...");
        if (![self setupWriter]) {
            NSLog(@"Setup Writer Failed");
            return;
        }

        [_capSession startRunning];
        _isRecording = YES;
    }
}

-(void) stopVideoRecording
{
    if (_isRecording) {
        _isRecording = NO;

        [_capSession stopRunning];

        if (![_videoWriter finishWriting]) {
            NSLog(@"finishWriting returned NO");
        }
        //[_videoWriter endSessionAtSourceTime:lastSampleTime];
        //[_videoWriterInput markAsFinished];
        //[_audioWriterInput markAsFinished];

        NSLog(@"video recording stopped");
    }
}
Steve McFarlin
  • Thank you very much Steve, your hints were very, very helpful. Now the video recording also has audio! – kalos Jan 08 '11 at 16:33
  • @Steve and kalos, can you give correct working code where it's possible to record both audio and video? – sach Nov 06 '11 at 16:44
  • You may want to take a look at AVCamDemo from the WWDC 2010 sample code. – Steve McFarlin Nov 14 '11 at 00:24
  • Hi Steve, I already use the same code to live-broadcast video to an RTSP server, using the "Encoder demo" code from the "GTCL" site. Please help me; I get a crash in the "onNALU" method after adding the audio writing code. – Name is Nilay Apr 17 '14 at 11:17
  • Apple should make capturing audio/video more complex. Why make it simple when you can make it complex? – Duck Dec 12 '16 at 22:56
  • Hi @SteveMcFarlin, I am not able to set the audio settings with video. Can you please provide a sample? It would help me a lot. – Asif Raza Jul 30 '18 at 09:30

First, do not use [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], as it is not the native format of the camera. Use [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] instead.

Also, you should always check before calling startWriting that the writer isn't already running. You do not need to set the session end time, as finishWriting will do that.
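A minimal sketch of both suggestions, reusing the property names from the question (this is an illustration under those assumptions, not a complete recorder):

// In the capture setup: ask the camera for its native bi-planar YUV format.
_videoOutput.videoSettings =
    [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// In the capture callback: only start the writer once, while it is still in the Unknown state.
if (_videoWriter.status == AVAssetWriterStatusUnknown) {
    [_videoWriter startWriting];
    [_videoWriter startSessionAtSourceTime:lastSampleTime];
}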

coletrain
JeffMc
  • What's wrong with kCVPixelFormatType_32BGRA? If you use the native format, you will most likely end up converting to BGRA anyway via shaders, which is probably what Apple does for you when you specify BGRA... – jjxtra Jan 26 '15 at 15:18