
EDIT: The strangest thing: when running this code from a full app, everything works. I had only ever been running the movie creation from my unit tests, and only there does it fail. I'm still trying to figure out why...

I'm trying to combine video + audio + text using AVMutableComposition and export it to a new video.

My code is based on the AVEditDemo from WWDC '10

I added a purple background to the CATextLayer so I would know for a fact that the layer is exported to the movie, but no text is shown... I tried playing with various fonts, positions, and color definitions, but nothing helped, so I decided to post the code here and see if anyone has stumbled across something similar and can tell me what I'm missing.

Here's the code (self.audio and self.video are AVURLAssets):

CMTime exportDuration = self.audio.duration;

AVMutableComposition *composition = [[AVMutableComposition alloc] init];

AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *videoTrack = [[self.video tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

// add the video in loop until the audio ends
CMTime currStartTime = kCMTimeZero;
while (CMTimeCompare(currStartTime, exportDuration) < 0) {
    CMTime timeRemaining = CMTimeSubtract(exportDuration, currStartTime);
    CMTime currLoopDuration = self.video.duration;

    if (CMTimeCompare(currLoopDuration, timeRemaining) > 0) {
        currLoopDuration = timeRemaining;
    }
    CMTimeRange currLoopTimeRange = CMTimeRangeMake(kCMTimeZero, currLoopDuration);

    [compositionVideoTrack insertTimeRange:currLoopTimeRange ofTrack:videoTrack
                                    atTime:currStartTime error:nil];

    currStartTime = CMTimeAdd(currStartTime, currLoopDuration);
}

AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

AVAssetTrack *audioTrack = [self.audio.tracks objectAtIndex:0];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.audio.duration) ofTrack:audioTrack atTime:kCMTimeZero error:nil];

AVMutableVideoComposition *videoComposition;

// the text layer part - THIS IS THE PART THAT DOESN'T WORK WELL
CALayer *animatedTitleLayer = [CALayer layer];
CATextLayer *titleLayer = [[CATextLayer alloc] init];
titleLayer.string = @"asdfasdf";
titleLayer.alignmentMode = kCAAlignmentCenter;
titleLayer.bounds = CGRectMake(0, 0, self.video.naturalSize.width / 2, self.video.naturalSize.height / 2);
titleLayer.opacity = 1.0;
titleLayer.backgroundColor = [UIColor purpleColor].CGColor;

[animatedTitleLayer addSublayer:titleLayer];
animatedTitleLayer.position = CGPointMake(self.video.naturalSize.width / 2.0, self.video.naturalSize.height / 2.0);

// build a Core Animation tree that contains both the animated title and the video.
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, self.video.naturalSize.width, self.video.naturalSize.height);
videoLayer.frame = CGRectMake(0, 0, self.video.naturalSize.width, self.video.naturalSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:animatedTitleLayer];

videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, exportDuration);
AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];

passThroughInstruction.layerInstructions = [NSArray arrayWithObject:passThroughLayer];
videoComposition.instructions = [NSArray arrayWithObject:passThroughInstruction];

videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = self.video.naturalSize;

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];

exportSession.videoComposition = videoComposition;
exportSession.outputURL = [NSURL fileURLWithPath:self.outputFilePath];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;

[exportSession exportAsynchronouslyWithCompletionHandler:^() {
    // save the video ...
}];
yonix
  • I am not sure, but maybe this could help: http://stackoverflow.com/questions/7205820/iphone-watermark-on-recorded-video – Dilip Rajkumar Apr 28 '12 at 15:30
  • Thanks, I'm doing something pretty much the same as what's mentioned there. If someone spots the critical difference between my code and the one in the relevant answer I'd be super grateful. – yonix Apr 29 '12 at 08:19
  • Have you tried setting the font size? – Jeshua Lacock Apr 30 '12 at 00:55
  • Yes. As mentioned in the edit, this code ran successfully when I ran it from my full app, but when I created the video from a unit test (using OCUnit), it didn't. Still figuring out why. – yonix Apr 30 '12 at 06:06

4 Answers


I ran into the same issue in a different context. In my case, I had moved preparation of the AVMutableComposition to a background thread. Moving that part of preparation back to the main queue/thread made CATextLayer overlays work properly again.

This may not apply exactly to your unit-testing context, but my guess is that CATextLayer/AVFoundation depends on some part of UIKit/AppKit being available (a drawing context? a current screen?) in that thread context, which might explain the failure we are both seeing.
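For illustration, here is a minimal sketch of that workaround. It assumes the composition and video-composition setup from the question is wrapped in a hypothetical buildComposition:videoComposition: helper; only the layer/composition setup is forced onto the main queue, while the export itself still runs asynchronously:

dispatch_async(dispatch_get_main_queue(), ^{
    // Build the AVMutableComposition and AVMutableVideoComposition (including the
    // CATextLayer and the AVVideoCompositionCoreAnimationTool) on the main queue.
    AVMutableComposition *composition = nil;
    AVMutableVideoComposition *videoComposition = nil;
    [self buildComposition:&composition videoComposition:&videoComposition]; // hypothetical helper

    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetMediumQuality];
    exportSession.videoComposition = videoComposition;
    exportSession.outputURL = [NSURL fileURLWithPath:self.outputFilePath];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;

    // The export itself can still happen off the main thread; AVFoundation invokes
    // the completion handler on a background queue.
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        // save the video ...
    }];
});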

Adam Preble
  • Hi Adam do you know if it's possible to merge images and videos using `AVMutableCompositionTracks` (similar to how you merge video together), or do you need `CALayers` to incorporate images into a merged video. Question here: http://stackoverflow.com/questions/34937862/merge-videos-images-in-avmutablecomposition-using-avmutablecompositiontrack – Crashalot Jan 22 '16 at 03:00

I had the problem that almost everything rendered fine, including images in a CALayer with its contents set to a CGImage. The exception was the CATextLayer's text: if I set a background color, a beginTime, and a duration on the CATextLayer, those were rendered perfectly as well, but the actual text refused to appear. That was all on the simulator; when I ran it on the phone, it was perfect.

Conclusion: the simulator renders videos nicely... until you use CATextLayer.

Jan

Further investigation is needed, but as far as I can tell right now, CATextLayer inside an AVMutableVideoComposition simply doesn't work from within a logic unit-test target; this feature must be tested from a regular target.

yonix

My problem was that I needed to set contentsGravity = kCAGravityBottomLeft -- otherwise my text was off the screen.
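As a hedged sketch, applied to a title layer like the one in the question (the exact layer this needs to be set on may vary with your layer tree):

// Anchor the layer's rendered contents at the bottom-left so the text stays
// inside the render frame (titleLayer is the CATextLayer from the question).
titleLayer.contentsGravity = kCAGravityBottomLeft;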

John Albano