I have to export a movie from my iPhone application that contains UIImages from an NSArray, and add some audio files in .caf format that have to start at pre-specified times. I have been able to use AVAssetWriter (after going through many questions and answers on this and other sites) to export the video portion containing the images, but I can't seem to find a way to add the audio files to complete the movie.

Here is what I have so far:

-(void) writeImagesToMovieAtPath:(NSString *) path withSize:(CGSize) size
{
    NSLog(@"Write Started");

    NSError *error = nil;

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                              [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];    
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                               [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                               nil];

    AVAssetWriterInput* videoWriterInput = [[AVAssetWriterInput
                                    assetWriterInputWithMediaType:AVMediaTypeVideo
                                    outputSettings:videoSettings] retain];


    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                 sourcePixelBufferAttributes:nil];

    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    videoWriterInput.expectsMediaDataInRealTime = YES; // NO is more appropriate for offline encoding, but YES works here with the retry loop below
    [videoWriter addInput:videoWriterInput];

    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    CVPixelBufferRef buffer = NULL;

    // Convert each UIImage to a pixel buffer and append it at the right frame time.

    int frameCount = 0;

    for (UIImage *img in imageArray)
    {
        buffer = [self pixelBufferFromCGImage:[img CGImage] andSize:size];

        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30)
        {
            if (adaptor.assetWriterInput.readyForMoreMediaData)
            {
                printf("appending %d attempt %d\n", frameCount, j);

                CMTime frameTime = CMTimeMake(frameCount, (int32_t) kRecordingFPS);
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];

                [NSThread sleepForTimeInterval:0.05];
            }
            else
            {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        // Release the pixel buffer once per frame, after the retry loop --
        // releasing it inside the loop could free it before a retry uses it.
        if (buffer)
            CVBufferRelease(buffer);
        if (!append_ok) {
            printf("error appending image %d times %d\n", frameCount, j);
        }
        frameCount++;
    }

    //Finish the session:
    [videoWriterInput markAsFinished];  
    [videoWriter finishWriting];
    NSLog(@"Write Ended");
}
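
For reference, this is roughly how the method above might be driven. Note that kRecordingFPS and imageArray are assumed to exist elsewhere in the class (an integer frame-rate constant and an NSArray of UIImages); the path and size values here are purely illustrative:

#define kRecordingFPS 30 // assumed frame-rate constant used by the writer loop above

NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                      NSUserDomainMask, YES) objectAtIndex:0];
NSString *moviePath = [docs stringByAppendingPathComponent:@"video.mov"];

// AVAssetWriter will not overwrite an existing file, so remove any stale copy first.
[[NSFileManager defaultManager] removeItemAtPath:moviePath error:NULL];

[self writeImagesToMovieAtPath:moviePath withSize:CGSizeMake(480, 320)];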

And now the code for pixelBufferFromCGImage:

- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image andSize:(CGSize) size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                         nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
                                      size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, 
                                      &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    // Use the buffer's actual row stride: CoreVideo may pad each row, so
    // hard-coding 4*size.width breaks for widths that are not a multiple of 16.
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
                                             size.height, 8, CVPixelBufferGetBytesPerRow(pxbuffer),
                                             rgbColorSpace, kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0)); // identity transform; adjust here if rotation is ever needed
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), 
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

So can you help me out with how to add the audio files: how to make buffers for them, the adaptor, the input settings, etc.?

If this approach might cause a problem, please guide me on how to use an AVMutableComposition to export the image array as video instead.

MuTaTeD
  • Ok, I have been able to add the audio files using AVAssetReaders and AVAssetWriterInputs. However, when I add the audio files they start one after the other without any pause (one finishes and the next one starts) instead of starting at predetermined times. So how do I tell AVAssetWriter to take the input at a certain time? As I understand it, [startSessionAtSourceTime] is for determining the time of the source, not the time in the destination movie. Any hints? (See the retiming sketch after these comments.) – MuTaTeD Apr 14 '11 at 13:29
  • You're awesome for posting such detailed solutions for others. – TigerCoding Nov 30 '11 at 13:43
  • Is this also working with 1080*1920 images? I have implemented the same code and it works well with 720*1280 (720/16), but not with video widths where (video width/16) gives a floating-point result. Any suggestion? – Dipen Chudasama Dec 04 '15 at 16:33
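
Regarding the timing question in the first comment above: with a pure AVAssetWriter pipeline, the usual way to shift audio to a chosen start time is to retime each sample buffer before appending it to the writer input. Below is a minimal sketch; sampleBuffer, audioWriterInput and audioClipStartTime are illustrative names, assuming the buffers come from an AVAssetReaderTrackOutput and the input is a configured AVAssetWriterInput. (The AVMutableComposition approach in the accepted answer below avoids this entirely and is simpler.)

#import <CoreMedia/CoreMedia.h>

// Returns a copy of `sample` with every timestamp shifted by `offset`.
// Follows the Create rule: the caller must CFRelease the returned buffer.
static CMSampleBufferRef CreateRetimedSampleBuffer(CMSampleBufferRef sample, CMTime offset)
{
    CMItemCount count = 0;
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, NULL, &count);

    CMSampleTimingInfo *timing = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, timing, &count);

    for (CMItemCount i = 0; i < count; i++) {
        timing[i].presentationTimeStamp = CMTimeAdd(timing[i].presentationTimeStamp, offset);
        if (CMTIME_IS_VALID(timing[i].decodeTimeStamp))
            timing[i].decodeTimeStamp = CMTimeAdd(timing[i].decodeTimeStamp, offset);
    }

    CMSampleBufferRef retimed = NULL;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sample, count, timing, &retimed);
    free(timing);
    return retimed;
}

// Usage inside the reader/writer copy loop:
//   CMSampleBufferRef shifted = CreateRetimedSampleBuffer(sampleBuffer, audioClipStartTime);
//   [audioWriterInput appendSampleBuffer:shifted];
//   CFRelease(shifted);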

2 Answers


I ended up exporting the video separately using the above code and then adding the audio files with an AVMutableComposition & AVAssetExportSession. Here is the code:

-(void) addAudioToFileAtPath:(NSString *) filePath toPath:(NSString *)outFilePath
{
    NSError * error = nil;

    AVMutableComposition * composition = [AVMutableComposition composition];


    AVURLAsset * videoAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:filePath] options:nil];

    AVAssetTrack * videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo 
                                                                                preferredTrackID: kCMPersistentTrackID_Invalid];

    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,videoAsset.duration) ofTrack:videoAssetTrack atTime:kCMTimeZero
                                     error:&error];     

    CMTime audioStartTime = kCMTimeZero;
    for (NSDictionary * audioInfo in audioInfoArray)
    {
        NSString * pathString = [audioInfo objectForKey:audioFilePath];
        AVURLAsset * urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:pathString] options:nil];

        AVAssetTrack * audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio 
                                                                                    preferredTrackID: kCMPersistentTrackID_Invalid];

        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,urlAsset.duration) ofTrack:audioAssetTrack atTime:audioStartTime error:&error];      

        // Advance the insertion point by this clip's duration, rounded to a whole
        // number of frames (kRecordingFPS timescale) so clips stay frame-aligned.
        audioStartTime = CMTimeAdd(audioStartTime, CMTimeMake((int) (([[audioInfo objectForKey:audioDuration] floatValue] * kRecordingFPS) + 0.5), kRecordingFPS));
    }
    AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
    // assetExport.videoComposition = mutableVideoComposition; // not needed: the
    // composition passed to initWithAsset: already carries the video track (see comments)

    assetExport.outputFileType = AVFileTypeQuickTimeMovie; // i.e. @"com.apple.quicktime-movie"
    assetExport.outputURL = [NSURL fileURLWithPath:outFilePath];

    [assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {
         switch (assetExport.status) 
         {
             case AVAssetExportSessionStatusCompleted:
//                export complete 
                 NSLog(@"Export Complete");
                 break;
             case AVAssetExportSessionStatusFailed:
                 NSLog(@"Export Failed");
                 NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
//                export error (see exportSession.error)  
                 break;
             case AVAssetExportSessionStatusCancelled:
                 NSLog(@"Export Cancelled");
                 NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
//                export cancelled  
                 break;
         }
     }];    
}
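
As described in the comment exchange below, the audioInfoArray ivar read by this method holds one dictionary per audio clip, keyed by two (assumed) NSString constants, audioFilePath and audioDuration. A purely illustrative setup, with placeholder paths, durations, and output locations:

// Hypothetical setup for the audioInfoArray ivar used above (MRC-style).
static NSString * const audioFilePath = @"audioFilePath"; // assumed key constant
static NSString * const audioDuration = @"audioDuration"; // assumed key constant

audioInfoArray = [[NSArray alloc] initWithObjects:
    [NSDictionary dictionaryWithObjectsAndKeys:
        @"/path/to/first.caf",  audioFilePath,
        [NSNumber numberWithFloat:2.5f], audioDuration, nil],
    [NSDictionary dictionaryWithObjectsAndKeys:
        @"/path/to/second.caf", audioFilePath,
        [NSNumber numberWithFloat:4.0f], audioDuration, nil],
    nil];

[self addAudioToFileAtPath:moviePath toPath:outputMoviePath];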
MuTaTeD
  • Can you please replace the "for" loop with a single "audioInfo" dictionary which has all the values which need to be set so that it becomes more copy-paste friendly? :) – Chintan Patel Jun 12 '11 at 16:51
  • @Chintan Patel: I had to add variable lengths of different audio files against different portions of the video in the generated movie (I end up using the complete audio files instead), so I made a dictionary for each audio file to be included and added them all to the array (audioInfoArray). The audioInfo dictionary contains the following keys **audioFilePath** and **audioDuration**, _NSString_ & _float_ respectively. – MuTaTeD Jun 12 '11 at 20:34
  • 2
    There's a bug with the code. What is the mutableVideoComposition variable? assetExport.videoComposition = mutableVideoComposition; It's not referenced anywhere else. – Paul Solt May 24 '12 at 19:30
  • That might be the compositionVideoTrack variable; however, I am not sure because I have not touched that code for nearly a year now. – MuTaTeD May 28 '12 at 10:29
  • @PaulSolt You can just comment that line out. It's dead code, and without it everything should work fine. – Eric Brotto Oct 14 '12 at 13:57
  • Can you form a video file from a single image and merge audio into it, so that the video's length matches the audio file's length? @MuTaTeD I'm not getting what you've written here. – Himanshu May 13 '13 at 14:03
  • The line `assetExport.videoComposition = mutableVideoComposition;` isn't needed. The `assetExport` object receives the composition data with `initWithAsset: composition`. – Boz Jul 28 '15 at 13:59

> Can you please replace the "for" loop with a single "audioInfo" dictionary which has all the values which need to be set so that it becomes more copy-paste friendly? :)

If you just want to add a single audio file, the following code should replace the for loop:

NSString * pathString = [self getAudioFilePath];
AVURLAsset * urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:pathString] options:nil];

AVAssetTrack * audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio 
                                                    preferredTrackID: kCMPersistentTrackID_Invalid];

[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,urlAsset.duration) ofTrack:audioAssetTrack atTime:kCMTimeZero error:&error];      
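
And if that single file should start at a pre-specified time rather than at the beginning, only the atTime: argument changes. A small variation (the 5-second offset is just an arbitrary example value):

CMTime audioStartTime = CMTimeMakeWithSeconds(5.0, 600); // 600 is a common video timescale
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, urlAsset.duration)
                               ofTrack:audioAssetTrack
                                atTime:audioStartTime
                                 error:&error];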
MuTaTeD