I'm currently developing an iOS app using PBJVision and am trying to save landscape videos in the correct orientation.
I use the writeVideoAtPathToSavedPhotosAlbum:completionBlock: method to save the video to the device's photo library:
[self.library writeVideoAtPathToSavedPhotosAlbum:videoPathURL
                                 completionBlock:^(NSURL *assetURL, NSError *error) {
    [self encodeVideoOrientation:assetURL]; // method below
    if (error.code == 0) {
        [self.library assetForURL:assetURL
                      resultBlock:^(ALAsset *asset) {
                          // assign the video to the album
                      }
                     failureBlock:^(NSError *error) {
                          NSLog(@"Failed to save video to library");
                      }];
    } else {
    }
}];
I then pass the saved video's assetURL to a method I wrote, encodeVideoOrientation:, which is intended to overwrite the file with a rotated copy. Note that for testing I apply the same rotation (M_PI, i.e. 180 degrees) to every video regardless of orientation, yet no rotation ever shows up in the output.
- (void)encodeVideoOrientation:(NSURL *)anOutputFileURL
{
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:anOutputFileURL options:nil];
    AVAssetTrack *sourceVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    CGSize size = [sourceVideoTrack naturalSize];

    CGAffineTransform rotationTransform;
    CGAffineTransform transformToApply = videoAsset.preferredTransform;
    CGSize renderSize;
    CGFloat newWidth;
    CGFloat newHeight;
    float currentVideoRotation;
    // Below, I rotate the video regardless of orientation to check whether even
    // portrait videos end up rotated; they do not.
    if (self.currentOrientationByAccelerometer == LANDSCAPE_RIGHT) {
        currentVideoRotation = M_PI; // for testing, video should appear upside down
        newWidth = size.height;
        newHeight = size.width;
        NSLog(@"Should be landscape right");
    } else if (self.currentOrientationByAccelerometer == LANDSCAPE_LEFT) {
        currentVideoRotation = M_PI; // for testing, should appear upside down
        newWidth = size.height;
        newHeight = size.width;
        NSLog(@"Should be landscape left");
    } else { // default to portrait
        currentVideoRotation = M_PI;
        newWidth = size.width;
        newHeight = size.height;
        NSLog(@"Should be portrait");
    }

    renderSize = CGSizeMake(newWidth, newHeight);
    rotationTransform = CGAffineTransformMakeRotation(currentVideoRotation);
    AVMutableComposition *composition = [AVMutableComposition composition];

    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];
    [compositionVideoTrack setPreferredTransform:rotationTransform]; // Rotations here?

    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:sourceAudioTrack atTime:kCMTimeZero error:nil];

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);

    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
    [layerInstruction setTransform:transformToApply atTime:kCMTimeZero]; // Rotations here?
    instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.renderScale = 1.0;
    videoComposition.renderSize = renderSize;
    videoComposition.instructions = [NSArray arrayWithObject:instruction];
    AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];

    NSString *videoName = @"export.mov";
    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }

    assetExport.outputFileType = AVFileTypeMPEG4;
    assetExport.outputURL = exportUrl;
    assetExport.shouldOptimizeForNetworkUse = YES;
    assetExport.videoComposition = videoComposition;
    [assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        switch (assetExport.status) {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export Complete");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export Failed");
                NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export Cancelled");
                NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
                break;
            default:
                break;
        }
    }];
}
The export completes successfully, but the output video looks unchanged no matter what CGAffineTransform I set.
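For debugging, something like the helper below could be used to log what the capture file coming out of PBJVision actually reports before any composition is built (a minimal sketch; the method name is mine, and synchronous property access is assumed to be acceptable for a local file):

// Hypothetical debug helper: logs the source track's naturalSize and
// preferredTransform of the file PBJVision produced.
- (void)logSourceOrientation:(NSURL *)fileURL
{
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:fileURL options:nil];
    NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if (videoTracks.count == 0) {
        NSLog(@"No video track found at %@", fileURL);
        return;
    }
    AVAssetTrack *track = [videoTracks objectAtIndex:0];
    CGAffineTransform t = track.preferredTransform;
    // An identity transform here (1 0 0 1 0 0) would mean the recording itself
    // carries no orientation metadata.
    NSLog(@"naturalSize: %.0f x %.0f, preferredTransform: [%.2f %.2f %.2f %.2f %.2f %.2f]",
          track.naturalSize.width, track.naturalSize.height,
          t.a, t.b, t.c, t.d, t.tx, t.ty);
}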
I referenced this SO post in writing my code:
iOS AVFoundation: Setting Orientation of Video
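My reading of that answer is that the rotation belongs in the layer instruction, concatenated with a translation so the rotated frame stays inside the render rectangle; here is a rough sketch of that idea, reusing size and layerInstruction from my method above (this is only my interpretation, so it may be off):

// Sketch of my understanding of the linked answer (possibly wrong):
// rotate 90 degrees, then translate so the rotated frame lands back
// inside the render rectangle instead of at negative coordinates.
CGAffineTransform rotate = CGAffineTransformMakeRotation(M_PI_2);
CGAffineTransform translate = CGAffineTransformMakeTranslation(size.height, 0.0);
[layerInstruction setTransform:CGAffineTransformConcat(rotate, translate) atTime:kCMTimeZero];
// renderSize would then presumably be CGSizeMake(size.height, size.width).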
PBJVision's orientation handling also appears to be an open issue, according to this report on the project's GitHub:
https://github.com/piemonte/PBJVision/issues/84
How do I save a rotated video captured with PBJVision?