24

I'm trying to use AVFoundation to crop videos I'm recording. So let's say I create an AVCaptureVideoPreviewLayer and set its frame to be 300x300.

AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureVideoPreviewLayer.delegate = self;
captureVideoPreviewLayer.frame = CGRectMake(0,0, 300, 300);
[previewView.layer addSublayer:captureVideoPreviewLayer];

The user sees the video cropped. I'd like to save the video exactly the way the user is viewing it. Using AVCaptureMovieFileOutput, the video obviously gets saved without cropping. I was considering using an AVCaptureVideoDataOutput to intercept the frames and crop them myself, but I was wondering if there is a more efficient way to do this, perhaps with AVAssetExportSession and an AVVideoComposition.

Any guidance would be appreciated.

haider
  • Were you able to get the output to look as crisp as the preview layer? That's been a problem, matching the sharpness of AVLayerVideoGravityResizeAspectFill. – Crashalot Feb 08 '16 at 21:19

2 Answers

31

Something like this. 99% of this code just sets things up to apply a custom CGAffineTransform and then save out the result.

I'm assuming that you want the cropped video to take up the full size/width of the output - so that e.g. a scale transform is the correct solution (you zoom in on the video, giving the effect of having cropped + resized).

AVAsset* asset = // your input

AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(320, 240);
videoComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30) );

AVMutableVideoCompositionLayerInstruction *transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
CGAffineTransform finalTransform = // set up a transform that grows the video, effectively causing a crop
[transformer setTransform:finalTransform atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject:instruction];

AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL=url3;
exporter.outputFileType=AVFileTypeQuickTimeMovie;

[exporter exportAsynchronouslyWithCompletionHandler:^(void){}];
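A hedged sketch of what the placeholder transform might look like, assuming a 480x360 source track cropped to a centered square and rendered at 320x320. The numbers are illustrative, not from the original; derive your own from your preview geometry, and set renderSize to match.

```objc
// Illustrative only: crop a centered 360x360 square out of a 480x360 track
// by translating the crop origin to (0,0) and scaling up to the render size.
CGSize naturalSize = clipVideoTrack.naturalSize;                  // assume 480x360
CGFloat cropSide   = MIN(naturalSize.width, naturalSize.height);  // 360
CGFloat scale      = 320.0 / cropSide;                            // fill a 320x320 renderSize
CGFloat xOffset    = (naturalSize.width - cropSide) / 2.0;        // 60

// CGAffineTransformTranslate applies the translation *before* the scale,
// so each source point p maps to ((p.x - xOffset) * scale, p.y * scale).
CGAffineTransform t = CGAffineTransformMakeScale(scale, scale);
CGAffineTransform finalTransform = CGAffineTransformTranslate(t, -xOffset, 0);

// Remember to match the composition to the crop:
// videoComposition.renderSize = CGSizeMake(320, 320);
```

Anything outside the render size after the transform is simply discarded, which is what produces the crop.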
fluidsonic
Adam
  • Adam, thanks for the post. I actually did exactly that and was going to post the answer, but you beat me to the punch! Thank you for the help regardless...I think others will definitely benefit. – haider Mar 09 '11 at 07:29
  • Could you elaborate on getting the AVAsset input. Do you run this on the movie file after its been saved with AVCaptureMovieFileOutput? – Stephen Handley Mar 07 '12 at 21:19
  • AVAsset is Apple's catch-all class that wraps any kind of input you could provide to AVFoundation. Off the top of my head, you want to look at the various subclasses that exist in AVF (go to any of Apple's docs pages for AVF, and click on any link to "Class Reference" and you'll get the full list) - they're all named "AV[something]Asset" IIRC. – Adam Mar 08 '12 at 10:12
  • Also, IIRC some things extend AVAsset for convenience, although it's not obvious unless you look at the docs. Basically, anything that could conceivably represent an asset ... pretty much extends AVAsset – Adam Mar 08 '12 at 10:13
  • Hey, Adam, I am using the same code and getting audio but no video display. Can you tell me any additional code for cropping video? – Vishal Khatri Jan 07 '14 at 10:19
  • Can we get an example of a "transform that grows the video, effectively causing a crop"? – etayluz Jul 17 '15 at 15:34
  • What is compositionVideoTrack used for? This code doesn't compile and raises many warnings. – etayluz Jul 17 '15 at 15:50
  • @etayluz here is a post to help with growing/cropping video: http://stackoverflow.com/questions/3968879/simultaneous-avcapturevideodataoutput-and-avcapturemoviefileoutput – Crashalot Feb 08 '16 at 21:18
  • @Adam Can you please provide a working example. A demo of this in Swift would be really life-saving – Ankit Kumar Gupta Apr 13 '17 at 07:55
  • @AnkitKumarGupta You have a working example - you replied to it. If you have a different problem, ask a new question. That's what this site is for. – Adam Apr 18 '17 at 08:57
  • @Adam Thanks I was able to create this for my self. You can have a look and tell me if it could be improved or something that I'm doing wrong. > https://github.com/ankit-betterbutter/CustomCamera – Ankit Kumar Gupta Apr 18 '17 at 09:16
11

iOS 7 added a layer instruction specifically for cropping:

AVMutableVideoCompositionLayerInstruction's setCropRectangle:atTime:
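A minimal sketch of how that might slot into the first answer's setup, reusing its clipVideoTrack / instruction / videoComposition variables. The 300x300 rect is just the question's preview size; as noted in the comments, the rect is interpreted in the track's pixel coordinates, not screen points.

```objc
AVMutableVideoCompositionLayerInstruction *cropInstruction =
    [AVMutableVideoCompositionLayerInstruction
        videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];

// The rect is in the video track's pixel coordinate space, not screen points.
[cropInstruction setCropRectangle:CGRectMake(0, 0, 300, 300)
                           atTime:kCMTimeZero];

instruction.layerInstructions = [NSArray arrayWithObject:cropInstruction];
// Match the render size to the crop so the output isn't letterboxed:
videoComposition.renderSize = CGSizeMake(300, 300);
```

Unlike the transform approach, this crops without scaling; combine it with a transform if you also want the output zoomed to a particular size.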

_mike

nibeck
  • This does not provide the expected results. FYI. This setting will crop a rect, from the original video pixel size. Unrelated to screen size on device. – Maxim Veksler May 07 '14 at 23:26
  • @nibeck, could you kindly include a code block that demonstrates how to use this? – etayluz Jul 17 '15 at 15:24
  • I tried this. But did not work for me. Had to switch back to translation transform. – Sreeraj Aug 03 '15 at 04:29