
I have an app that allows a user to record a video with UIImagePickerController and then upload it to YouTube. The problem is that the video file that UIImagePickerController creates is HUGE, even when the video is only 5 seconds long. For example, a 5 second long video is 16-20 megabytes. I want to keep the video in 540 or 720 quality, but I want to reduce the file size.

I've been experimenting with AVFoundation and AVAssetExportSession to try to get a smaller file size. I've tried the following code:

AVAsset *video = [AVAsset assetWithURL:videoURL];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:video presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.outputURL = [pathToSavedVideosDirectory URLByAppendingPathComponent:@"vid1.mp4"];
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"done processing video!");
}];

But this hasn't reduced the file size at all. I know what I'm doing is possible because in Apple's Photos app, when you select "share on YouTube", it automatically processes the video file so it's small enough to upload. I want to do the same thing in my app.

How can I accomplish this?

The iOSDev
zakdances
  • Does the upload from Photos keep the quality and resolution intact? I suspect it reduces both to make the video smaller – davidfrancis Aug 03 '12 at 22:23
  • Nope, it preserves the video as it's uploaded. YouTube is capable of 1080p video. – zakdances Aug 03 '12 at 23:09
  • Does making the file output type AVFileTypeQuickTimeMovie reduce the size to your liking? Or even try the yourPickerController.videoQuality property to try to reduce its quality and hence size? – Just a coder Aug 04 '12 at 01:09
  • In my post I note that I want to keep the quality at 720 or 540. I'll try converting it to a MOV, but from what I understand it's a much bigger file format than MP4 – zakdances Aug 04 '12 at 02:03
  • The title is misleading since you are not using UIImagePickerController anywhere; you should change it to avoid confusion for future users – thibaut noah Jan 26 '17 at 11:00

13 Answers


With AVCaptureSession and AVAssetWriter you can set the compression settings as such:

NSDictionary *settings = @{AVVideoCodecKey:AVVideoCodecH264,
                           AVVideoWidthKey:@(video_width),
                           AVVideoHeightKey:@(video_height),
                           AVVideoCompressionPropertiesKey:
                               @{AVVideoAverageBitRateKey:@(desired_bitrate),
                                 AVVideoProfileLevelKey:AVVideoProfileLevelH264Main31, /* Or whatever profile & level you wish to use */
                                 AVVideoMaxKeyFrameIntervalKey:@(desired_keyframe_interval)}};

AVAssetWriterInput* writer_input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];

Edit: I guess if you insist on using the UIImagePicker to create the movie in the first place, you'll have to use AVAssetReader's copyNextSampleBuffer and AVAssetWriter's appendSampleBuffer methods to do the transcode.
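The transcode jgh describes is just a pump loop: copy samples out of an AVAssetReaderTrackOutput and append them to an AVAssetWriterInput configured with compression settings like the ones above. A rough sketch (assuming reader, writer, reader_output, and writer_input have already been created and attached; the names are illustrative, not from the original answer):

[writer startWriting];
[reader startReading];
[writer startSessionAtSourceTime:kCMTimeZero];

dispatch_queue_t queue = dispatch_queue_create("transcode.queue", NULL);
[writer_input requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while ([writer_input isReadyForMoreMediaData]) {
        CMSampleBufferRef buffer = [reader_output copyNextSampleBuffer];
        if (buffer) {
            [writer_input appendSampleBuffer:buffer];
            CFRelease(buffer); // copyNextSampleBuffer returns a retained buffer
        } else {
            // reader is exhausted (or failed); finish the writer
            [writer_input markAsFinished];
            [writer finishWritingWithCompletionHandler:^{ /* done */ }];
            break;
        }
    }
}];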

Shaheen Ghiassy
jgh
  • Wow...this is really, really good stuff. It's really frustrating that the documentation for this is missing or so hard to find. Why do you have to use copyNextSampleBuffer with a video created with UIImagePicker? Can't you just designate the mp4/mov it generates as an AVAsset and feed that directly into AVAssetWriter? – zakdances Aug 05 '12 at 20:37
  • Sorry, I should have been clearer. That's exactly right, you'll have to use the mp4/mov file you create from the UIImagePicker as the Asset for the AssetReader and then copy all samples from the reader into the writer. – jgh Aug 05 '12 at 20:44
  • When you say "copy all samples" do you mean use copyNextSampleBuffer? – zakdances Aug 06 '12 at 00:31
  • Yes, see http://stackoverflow.com/questions/5240581/how-to-use-avassetreader-and-avassetwriter-for-multiple-tracks-audio-and-video for some example code. – jgh Aug 06 '12 at 00:49
  • Hi, I am having the same issue. But can't understand AVAsset implementation logic. Please help me. If possible please provide some tutorial for the same. – iOS Monster Nov 16 '12 at 11:23
  • @jgh I am new to AVFoundation. I have same issue regarding to Video Compression can you send any sample project regarding this. – Suresh Dec 29 '12 at 09:11
  • @Infaz Not to be rude but you should learn how to read documentation. This goes for the rest of you asking me the same question. – jgh Jul 30 '15 at 17:53
  • Hi @jgh exporting a 60-second video takes about 10-20 seconds now. Besides playing with the `presetName` property, what other values do you suggest experimenting with to minimize export time while trying to preserve video quality? – Crashalot Jan 22 '16 at 01:29
  • Yes taking time is one constraint and also quality is another. can we have any other alternate robust solution (like facebook and other app does) with latest release of technology, as this question is too old?. Thanks – Jagdev Sendhav Mar 13 '18 at 17:24
  • The "right" way to do the export is to encode the video at the right quality level the first time as you're recording it. Recording it at ultra quality and then transcoding it is a waste of time unless you have reasons to do it. – jgh May 24 '18 at 17:41

yourfriendzak is right: setting cameraUI.videoQuality = UIImagePickerControllerQualityTypeLow; isn't the solution here. The solution is to reduce the data rate, or bit rate, which is what jgh is suggesting.

I have three methods. The first handles the UIImagePickerController delegate method:

// For responding to the user accepting a newly-captured picture or movie
- (void) imagePickerController: (UIImagePickerController *) picker didFinishPickingMediaWithInfo: (NSDictionary *) info {

// Handle movie capture
NSURL *movieURL = [info objectForKey:
                            UIImagePickerControllerMediaURL];

NSURL *uploadURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:[self randomString]] stringByAppendingString:@".mp4"]];

// Compress movie first
[self convertVideoToLowQuailtyWithInputURL:movieURL outputURL:uploadURL];
}

The second method converts the video to a lower bitrate, not to lower dimensions.

- (void)convertVideoToLowQuailtyWithInputURL:(NSURL*)inputURL
                               outputURL:(NSURL*)outputURL
{
//setup video writer
AVAsset *videoAsset = [[AVURLAsset alloc] initWithURL:inputURL options:nil];

AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

CGSize videoSize = videoTrack.naturalSize;

NSDictionary *videoWriterCompressionSettings =  [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:1250000], AVVideoAverageBitRateKey, nil];

NSDictionary *videoWriterSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey, videoWriterCompressionSettings, AVVideoCompressionPropertiesKey, [NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey, [NSNumber numberWithFloat:videoSize.height], AVVideoHeightKey, nil];

AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
                                         assetWriterInputWithMediaType:AVMediaTypeVideo
                                         outputSettings:videoWriterSettings];

videoWriterInput.expectsMediaDataInRealTime = YES;

videoWriterInput.transform = videoTrack.preferredTransform;

AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:nil];

[videoWriter addInput:videoWriterInput];

//setup video reader
NSDictionary *videoReaderSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey];

AVAssetReaderTrackOutput *videoReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:videoReaderSettings];

AVAssetReader *videoReader = [[AVAssetReader alloc] initWithAsset:videoAsset error:nil];

[videoReader addOutput:videoReaderOutput];

//setup audio writer
AVAssetWriterInput* audioWriterInput = [AVAssetWriterInput
                                        assetWriterInputWithMediaType:AVMediaTypeAudio
                                        outputSettings:nil];

audioWriterInput.expectsMediaDataInRealTime = NO;

[videoWriter addInput:audioWriterInput];

//setup audio reader
AVAssetTrack* audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

AVAssetReaderOutput *audioReaderOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];

AVAssetReader *audioReader = [AVAssetReader assetReaderWithAsset:videoAsset error:nil];

[audioReader addOutput:audioReaderOutput];    

[videoWriter startWriting];

//start writing from video reader
[videoReader startReading];

[videoWriter startSessionAtSourceTime:kCMTimeZero];

dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue1", NULL);

[videoWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:
 ^{

     while ([videoWriterInput isReadyForMoreMediaData]) {

         CMSampleBufferRef sampleBuffer;

         if ([videoReader status] == AVAssetReaderStatusReading &&
             (sampleBuffer = [videoReaderOutput copyNextSampleBuffer])) {

             [videoWriterInput appendSampleBuffer:sampleBuffer];
             CFRelease(sampleBuffer);
         }

         else {

             [videoWriterInput markAsFinished];

             if ([videoReader status] == AVAssetReaderStatusCompleted) {

                 //start writing from audio reader
                 [audioReader startReading];

                 [videoWriter startSessionAtSourceTime:kCMTimeZero];

                 dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue2", NULL);

                 [audioWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:^{

                     while (audioWriterInput.readyForMoreMediaData) {

                         CMSampleBufferRef sampleBuffer;

                         if ([audioReader status] == AVAssetReaderStatusReading &&
                             (sampleBuffer = [audioReaderOutput copyNextSampleBuffer])) {

                            [audioWriterInput appendSampleBuffer:sampleBuffer];
                                    CFRelease(sampleBuffer);
                         }

                         else {

                             [audioWriterInput markAsFinished];

                             if ([audioReader status] == AVAssetReaderStatusCompleted) {

                                 [videoWriter finishWritingWithCompletionHandler:^(){
                                     [self sendMovieFileAtURL:outputURL];
                                 }];

                             }
                         }
                     }

                 }];
             }
         }
     }
 }];
}

When successful, the third method, sendMovieFileAtURL:, is called, which uploads the compressed video at outputURL to the server.

Note that I've enabled ARC in my project, so you will have to add some release calls if ARC is turned off in yours.
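sendMovieFileAtURL: here stands in for whatever upload routine your app uses; it isn't shown in this answer. A hypothetical stub (the endpoint URL and request setup are placeholders, not part of the original code) might look like:

// Hypothetical upload stub -- replace the URL and request details with your server's API
- (void)sendMovieFileAtURL:(NSURL *)outputURL
{
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com/upload"]];
    request.HTTPMethod = @"POST";

    NSURLSessionUploadTask *task =
        [[NSURLSession sharedSession] uploadTaskWithRequest:request
                                                   fromFile:outputURL
                                          completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
                                              NSLog(@"upload finished, error: %@", error);
                                          }];
    [task resume];
}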

Erik
  • What kind of file size reduction are you seeing with this? Can you post the uncompressed .mov file size along with the mp4 file size after it's run through your code? – zakdances Apr 16 '13 at 12:10
  • This solution is complete and works great. I got a qHD (960x540) 21 second video from 80 MB down to 3 MB with the settings above. Just make sure that your outputURL is a fileURL [NSURL fileURLWithPath:]. And put your cleanup code right after [audioWriterInput markAsFinished]; I couldn't get code in the following 'if' statement to execute, but the videos come out great with minimal artifacting. – jbcaveman Dec 05 '13 at 16:00
  • Change that to "put your cleanup code right after [videoWriter finishWritingWithCompletionHandler:^(){ } I couldn't get code inside that completion handler to execute..." (wouldn't let me edit after 5 min) – jbcaveman Dec 05 '13 at 16:07
  • @JoelCave The above code doesn't seem to work for me. Code inside of [videoWriter finishWritingWithCompletionHandler is never getting called for me. I saw you also faced the same issue. Did you figure out how to fix this one? Thanks! – Manish Ahuja Jun 09 '14 at 09:30
  • No, I had to work around that issue, but if you figure it out please post the solution. Code is working fine otherwise. – jbcaveman Jun 09 '14 at 13:50
  • For iOS 7, solution to finishWritingWithCompletionHandler: not being called is to make sure videoWriter is retained. I did this by making it a strong property. Set to nil inside finishWritingWithCompletionHandler: For reference see: http://stackoverflow.com/questions/18801965/completion-handler-is-not-called-on-ios7-gm – Scott Carter Jul 01 '14 at 15:57
  • Along with retaining videoWriter (my previous comment), you might also consider calling endSessionAtSourceTime: as mentioned in answer by Mr. T in same thread at http://stackoverflow.com/questions/18801965/completion-handler-is-not-called-on-ios7-gm I have not experienced the occasional problem seen by Mr. T, but calling endSessionAtSourceTime: before finishWritingWithCompletionHandler: shouldn't hurt even though the docs seem to indicate that it shouldn't be needed. – Scott Carter Jul 01 '14 at 16:03
  • I put the same code on my side, and when I get the video from the documents folder I find the video there, but it doesn't play. Am I doing anything wrong? I just pasted the above code and commented out sendMovieFileAtURL: as I want to test first. Any suggestion – Soniya Aug 23 '14 at 12:20
  • @Scott Carter, in my case error after strong property in this *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetReader startReading] cannot be called again after reading has already started' in iOS 8 – dev.nikolaz Oct 14 '14 at 16:42
  • @dev.nikolaz Not immediately obvious what would be causing this. I might suggest that you post a new question and include the relevant code and error message. – Scott Carter Oct 14 '14 at 17:30
  • @dev.nikolaz I faced the same issue and solved it by making some changes in that method... check my answer at http://stackoverflow.com/questions/25564050/how-can-i-compress-a-video-in-ios-using-bit-rate/26381593#26381593 – Paras Joshi Oct 15 '14 at 11:41
  • @Erik I would like to invite you to answer this question http://stackoverflow.com/questions/26423304/avassetwriter-corrupting-the-video-trimmed-by-avassetexportsession – S.J Oct 17 '14 at 10:54
  • Couldn't get this to work on iOS 8: -[AVAssetReader startReading] cannot be called again after reading has already started' – etayluz Apr 27 '15 at 14:32
  • Works like a charm +1 – arunjos007 Oct 06 '16 at 04:48
  • Why does the camera open up when the status for AVAssetReaderStatus becomes completed? – Ghazalah Oct 23 '16 at 07:15
  • Any solution for above crash? – Jagdev Sendhav Mar 14 '18 at 06:27

UIImagePickerController has a videoQuality property of type UIImagePickerControllerQualityType, which is applied to recorded movies as well as to ones picked from the library (that happens during the transcoding phase).

Or if you have to deal with an existing asset (a file) that's not from the library, you might want to look at these presets:

AVAssetExportPresetLowQuality
AVAssetExportPresetMediumQuality
AVAssetExportPresetHighestQuality

and

AVAssetExportPreset640x480
AVAssetExportPreset960x540
AVAssetExportPreset1280x720
AVAssetExportPreset1920x1080

and pass one of them to the initializer of the AVAssetExportSession class. I'm afraid you'll have to experiment with those for your particular content, as there is no precise description of what "low" and "medium" quality mean, or of which quality will be used for the 640x480 or the 1280x720 preset. The only useful information in the docs is the following:

Export Preset Names for Device-Appropriate QuickTime Files

You use these export options to produce QuickTime .mov files with video size appropriate to the current device.

The export will not scale the video up from a smaller size. Video is compressed using H.264; audio is compressed using AAC.

Some devices cannot support some sizes.
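Using one of these presets is just a matter of passing it to the AVAssetExportSession initializer. A minimal sketch (inputURL and outputURL are placeholders for your own file URLs):

// Sketch: export with a fixed-size preset
AVAsset *asset = [AVAsset assetWithURL:inputURL];
AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                 presetName:AVAssetExportPreset960x540];
session.outputURL = outputURL;
session.outputFileType = AVFileTypeMPEG4;
session.shouldOptimizeForNetworkUse = YES;
[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"exported to %@", session.outputURL);
    } else {
        NSLog(@"export failed: %@", session.error);
    }
}];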

Aside from that, I don't remember having precise control over quality such as frame rate or freeform size, etc., in AVFoundation.

Update: I was wrong; there is a way to tweak all the parameters you mention, and it is AVAssetWriter indeed: How do I export UIImage array as a movie?

By the way, here is a link to a similar question with a code sample: iPhone: Programmatically compressing recorded video to share?

Sash Zats
  • I have been trying AVAssetExport but as you mentioned, the quality settings for it don't seem to do anything that UIImagePickerController doesn't already do with UIImagePickerControllerQualityType. AVAssetExportPresetMediumQuality and UIImagePickerControllerQualityType = medium are VERY low quality 360p, while the high quality setting seems to be an almost uncompressed 720p video with an unreasonably large file size. I'm pretty sure the answer to my question will involve using AVAssetWriter to alter the framerate and bitrate of the 720p video. – zakdances Aug 04 '12 at 22:34
  • I'm hoping someone with experience with AVAssetWriter can shine some light – zakdances Aug 04 '12 at 22:34
  • I was wrong, there is a way to tweak all parameters you mentions and it is AVAssetWriter indeed: http://stackoverflow.com/questions/3741323/how-do-i-export-uiimage-array-as-a-movie/3742212#3742212 – Sash Zats Aug 05 '12 at 05:55

Erik's answer may have been correct at the time he wrote it - but now with iOS 8 it's just crashing left and right; I've spent a few hours on it myself.

You need a PhD to work with AVAssetWriter - it's non-trivial: https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/05_Export.html#//apple_ref/doc/uid/TP40010188-CH9-SW1

There's an amazing library for doing exactly what you want, which is just an AVAssetExportSession drop-in replacement with more crucial features like changing the bit rate: https://github.com/rs/SDAVAssetExportSession

Here's how to use it:

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{

  SDAVAssetExportSession *encoder = [SDAVAssetExportSession.alloc initWithAsset:[AVAsset assetWithURL:[info objectForKey:UIImagePickerControllerMediaURL]]];
  NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
  NSString *documentsDirectory = [paths objectAtIndex:0];
  self.myPathDocs =  [documentsDirectory stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"lowerBitRate-%d.mov",arc4random() % 1000]];
  NSURL *url = [NSURL fileURLWithPath:self.myPathDocs];
  encoder.outputURL=url;
  encoder.outputFileType = AVFileTypeMPEG4;
  encoder.shouldOptimizeForNetworkUse = YES;

  encoder.videoSettings = @
  {
  AVVideoCodecKey: AVVideoCodecH264,
  AVVideoCompressionPropertiesKey: @
    {
    AVVideoAverageBitRateKey: @2300000, // Lower bit rate here
    AVVideoProfileLevelKey: AVVideoProfileLevelH264High40,
    },
  };
  encoder.audioSettings = @
  {
  AVFormatIDKey: @(kAudioFormatMPEG4AAC),
  AVNumberOfChannelsKey: @2,
  AVSampleRateKey: @44100,
  AVEncoderBitRateKey: @128000,
  };

  [encoder exportAsynchronouslyWithCompletionHandler:^
  {
    int status = encoder.status;

    if (status == AVAssetExportSessionStatusCompleted)
    {
      AVAssetTrack *videoTrack = nil;
      AVAsset *asset = [AVAsset assetWithURL:encoder.outputURL];
      NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
      videoTrack = [videoTracks objectAtIndex:0];
      float frameRate = [videoTrack nominalFrameRate];
      float bps = [videoTrack estimatedDataRate];
      NSLog(@"Frame rate == %f",frameRate);
      NSLog(@"bps rate == %f",bps/(1024.0 * 1024.0));
      NSLog(@"Video export succeeded");
      // encoder.outputURL <- this is what you want!!
    }
    else if (status == AVAssetExportSessionStatusCancelled)
    {
      NSLog(@"Video export cancelled");
    }
    else
    {
      NSLog(@"Video export failed with error: %@ (%ld)", encoder.error.localizedDescription, (long)encoder.error.code);
    }
  }];
}
etayluz
  • This is an excellent solution to compress an existing video. However, it is missing keys AVVideoWidthKey and AVVideoHeightKey in encoder.videoSettings. To use current, use this code: AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil]; NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo]; AVAssetTrack *track = [tracks objectAtIndex:0]; encoder.videoSettings = @ { .... AVVideoWidthKey : @(track.naturalSize.width), AVVideoHeightKey: @(track.naturalSize.height), .... } – Thibaud David Oct 21 '15 at 13:51
  • Hey Etayluz with the above settings and including video width and height as suggested by @ThibaudDavid it is converting a video of size 984374 bytes to 7924549 bytes which is 8x larger. So instead of compressing the video it is making it bigger. :( – Mayur Shrivas Oct 24 '15 at 14:58
  • You can either reduce width/height (using same factor to keep ratio) or reduce AVSampleRateKey to meet your needs – Thibaud David Oct 26 '15 at 10:24
  • Hi @ThibaudDavid I've tried to reduce the width and height by multiplying .75 and bitrate from 2300000 to 1960000 but then also 2175522 byte is getting exported to 3938850 bytes. :( – Mayur Shrivas Oct 26 '15 at 11:13
  • What is your input file bitrate? If you specify a lower one while converting, the file should contain fewer bytes. Try to pass [track estimatedDataRate] / 2 as the bitrate, for example, to be sure your value is lower – Thibaud David Oct 26 '15 at 12:29
  • @ThibaudDavid but in that case the video is losing it's width and height as black patches are getting displayed on left/right and up/down of the video. – Mayur Shrivas Oct 26 '15 at 15:37
  • @ThibaudDavid do you know if using `AVFileTypeQuickTimeMovie` or `AVFileTypeMPEG4` matters in terms of export time and quality? – Crashalot Jan 22 '16 at 01:26
  • @Crashalot Quality shouldn't be impacted by the container. However, AVFileTypeQuickTimeMovie should (untested!) be quicker, I guess, as it is native to Apple. – Thibaud David Jan 22 '16 at 08:35
  • @ThibaudDavid thanks David! Do you happen to know if it's possible to add images/text to merged videos through `AVMutableComposition`, or is using `AVVideoCompositionCoreAnimationTool` the only way? `AVVideoCompositionCoreAnimationTool` seems to severely slow down the export process so trying to avoid it if possible. Question: http://stackoverflow.com/questions/34937862/merge-videos-images-in-avmutablecomposition-using-avmutablecompositiontrack – Crashalot Jan 22 '16 at 08:52
  • @Crashalot I'd got for an UIImageView layer to avoid any processing time. – Thibaud David Jan 22 '16 at 09:30
  • @ThibaudDavid how do you add the UIImageView layer to the video composition without `AVVideoCompositionCoreAnimationTool`? Thanks so much for your help! – Crashalot Jan 22 '16 at 09:58
  • If processing time is important, don't add it to video. Just append it to your layer after playing your video. I don't think you really need to merge them, do you ? If you do, you'll have to use `AVVideoCompositionCoreAnimationTool ` , otherwise, it's just a matter of displaying it at the end... – Thibaud David Jan 22 '16 at 10:25
  • @ThibaudDavid yes we do need to merge them, unfortunately (for sharing purposes). So `AVVideoCompositionCoreAnimationTool` is unavoidable if you want to merge videos and images into one video? Thanks again for all your help. – Crashalot Jan 22 '16 at 19:59
  • @Crashalot, I don't think you can bypass this, but you can always do it in background before uploading it, without being blocking for the user – Thibaud David Jan 24 '16 at 00:47
  • Thanks @ThibaudDavid! I think there is another workaround. One more question if you don't mind. Your help has been very much appreciated. Any suggestions on how to stabilize video? Question here: http://stackoverflow.com/questions/34912050/avoiding-blurriness-at-start-end-of-video-even-after-using-setpreferredvideos – Crashalot Jan 24 '16 at 00:52
  • @Crashalot No idea, sorry ! – Thibaud David Jan 25 '16 at 08:24
  • @ThibaudDavid did you guys get this to work? when I use a video of around 10sec it will crash when trying to save down.. – trdavidson Mar 01 '16 at 03:22
  • @trdavidson Yes, that's working. What is the exception triggered by you crash ? – Thibaud David Mar 01 '16 at 13:26
  • @ThibaudDavid thanks for getting back to me - it doesn't throw any exceptions unfortunately, the app just closes down. It will from time to time (1/4) start throwing memory pressure warning which is odd, as I am running at around 30mbs tops.. Any thoughts? – trdavidson Mar 01 '16 at 23:50
  • @trdavidson Even with breakpoints activated and an exception breakpoint set on "All Exceptions", you won't enter the debugger before crashing ? That's strange with only 30mb of memory used by your device to get a memory warning crashing your app at this point – Thibaud David Mar 02 '16 at 09:17
  • @ThibaudDavid I found the culprit i believe. The crashes occur when I try to export files quickly after each other. What happens, is that the serial queue, which has a static name, is being deallocated because of it. So for me the following did the trick: const char *unique_q = [[NSString stringWithFormat:@"VideoQ %u", arc4random_uniform(1000000)] UTF8String]; self.inputQueue = dispatch_queue_create(unique_q, DISPATCH_QUEUE_SERIAL); What this does is create a serial queue with a unique id, so you never run into any conflict. Hope this helps somebody :) – trdavidson Mar 03 '16 at 03:19
  • The `AVAssetWriter` is really slow. Any suggestion? – Bagusflyer Apr 15 '21 at 13:39

Code for Swift 5 and Good Quality

Here's how to do it, following the code from this link. The problem with the link is that it only works with a .mov file output; if you want to output a .mp4 file, it will crash. The code below lets you get a .mp4 output. It is tried, tested, and works. For example, a 15-second video that is originally 27 MB gets reduced to 2 MB. If you want better quality, raise the bitrate. I have it set at 1250000.

Copy and paste this code:

import AVFoundation

// add these properties
var assetWriter: AVAssetWriter!
var assetWriterVideoInput: AVAssetWriterInput!
var audioMicInput: AVAssetWriterInput!
var videoURL: URL!
var audioAppInput: AVAssetWriterInput!
var channelLayout = AudioChannelLayout()
var assetReader: AVAssetReader?
let bitrate: NSNumber = NSNumber(value: 1250000) // *** you can change this number to increase/decrease the quality. The more you increase it, the better the video quality, but the compressed file size will also increase

// compression function; it returns a .mp4, but you can change it to .mov inside the do-try block towards the middle. Change assetWriter = try AVAssetWriter ... AVFileType.mp4 to AVFileType.mov
func compressFile(_ urlToCompress: URL, completion:@escaping (URL)->Void) {
    
    var audioFinished = false
    var videoFinished = false
    
    let asset = AVAsset(url: urlToCompress)
    
    //create asset reader
    do {
        assetReader = try AVAssetReader(asset: asset)
    } catch {
        assetReader = nil
    }
    
    guard let reader = assetReader else {
        print("Could not initialize asset reader; its try-catch probably failed")
        // show user error message/alert
        return
    }
    
    guard let videoTrack = asset.tracks(withMediaType: AVMediaType.video).first else { return }
    let videoReaderSettings: [String:Any] = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB]
    
    let assetReaderVideoOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    
    var assetReaderAudioOutput: AVAssetReaderTrackOutput?
    if let audioTrack = asset.tracks(withMediaType: AVMediaType.audio).first {
        
        let audioReaderSettings: [String : Any] = [
            AVFormatIDKey: kAudioFormatLinearPCM,
            AVSampleRateKey: 44100,
            AVNumberOfChannelsKey: 2
        ]
        
        assetReaderAudioOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: audioReaderSettings)
        
        if reader.canAdd(assetReaderAudioOutput!) {
            reader.add(assetReaderAudioOutput!)
        } else {
            print("Couldn't add audio output reader")
            // show user error message/alert
            return
        }
    }
    
    if reader.canAdd(assetReaderVideoOutput) {
        reader.add(assetReaderVideoOutput)
    } else {
        print("Couldn't add video output reader")
        // show user error message/alert
        return
    }
    
    let videoSettings:[String:Any] = [
        AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: self.bitrate],
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoHeightKey: videoTrack.naturalSize.height,
        AVVideoWidthKey: videoTrack.naturalSize.width,
        AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill
    ]
    
    let audioSettings: [String:Any] = [AVFormatIDKey : kAudioFormatMPEG4AAC,
                                       AVNumberOfChannelsKey : 2,
                                       AVSampleRateKey : 44100.0,
                                       AVEncoderBitRateKey: 128000
    ]
    
    let audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: audioSettings)
    let videoInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoSettings)
    videoInput.transform = videoTrack.preferredTransform
    
    let videoInputQueue = DispatchQueue(label: "videoQueue")
    let audioInputQueue = DispatchQueue(label: "audioQueue")
    
    do {
        
        let formatter = DateFormatter()
        formatter.dateFormat = "yyyy'-'MM'-'dd'T'HH':'mm':'ss'Z'"
        let date = Date()
        let tempDir = NSTemporaryDirectory()
        let outputPath = "\(tempDir)/\(formatter.string(from: date)).mp4"
        let outputURL = URL(fileURLWithPath: outputPath)
        
        assetWriter = try AVAssetWriter(outputURL: outputURL, fileType: AVFileType.mp4)
        
    } catch {
        assetWriter = nil
    }
    guard let writer = assetWriter else {
        print("assetWriter was nil")
        // show user error message/alert
        return
    }
    
    writer.shouldOptimizeForNetworkUse = true
    writer.add(videoInput)
    writer.add(audioInput)
    
    writer.startWriting()
    reader.startReading()
    writer.startSession(atSourceTime: CMTime.zero)
    
    let closeWriter:()->Void = {
        if (audioFinished && videoFinished) {
            self.assetWriter?.finishWriting(completionHandler: { [weak self] in
                
                if let assetWriter = self?.assetWriter {
                    do {
                        let data = try Data(contentsOf: assetWriter.outputURL)
                        print("compressFile - file size after compression: \(Double(data.count) / 1048576) mb")
                    } catch let err as NSError {
                        print("compressFile Error: \(err.localizedDescription)")
                    }
                }
                
                if let safeSelf = self, let assetWriter = safeSelf.assetWriter {
                    completion(assetWriter.outputURL)
                }
            })
            
            self.assetReader?.cancelReading()
        }
    }
    
    audioInput.requestMediaDataWhenReady(on: audioInputQueue) {
        while(audioInput.isReadyForMoreMediaData) {
            if let cmSampleBuffer = assetReaderAudioOutput?.copyNextSampleBuffer() {
                
                audioInput.append(cmSampleBuffer)
                
            } else {
                audioInput.markAsFinished()
                DispatchQueue.main.async {
                    audioFinished = true
                    closeWriter()
                }
                break
            }
        }
    }
    
    videoInput.requestMediaDataWhenReady(on: videoInputQueue) {
        // request data here
        while(videoInput.isReadyForMoreMediaData) {
            if let cmSampleBuffer = assetReaderVideoOutput.copyNextSampleBuffer() {
                
                videoInput.append(cmSampleBuffer)
                
            } else {
                videoInput.markAsFinished()
                DispatchQueue.main.async {
                    videoFinished = true
                    closeWriter()
                }
                break
            }
        }
    }
}

Here is how to use it if you're compressing a URL. The compressedURL is returned inside the callback:

@IBAction func buttonTapped(sender: UIButton) {

    // show activity indicator

    let videoURL = URL(string: "...")

    compressFile(videoURL) { (compressedURL) in

       // remove activity indicator
       // do something with the compressedURL such as sending to Firebase or playing it in a player on the *main queue*
    }
}

FYI, I noticed the audio slows things down quite a bit; you can also try running this on a background task to see if it runs any faster. If you add anything like an alert inside the compressFile function itself, you will have to show it on the main queue or the app will crash.

DispatchQueue.global(qos: .background).async { [weak self] in

    self?.compressFile(videoURL) { (compressedURL) in

        DispatchQueue.main.async { [weak self] in
            // also remove activity indicator on mainQueue in addition to whatever is inside the function itself that needs to be updated on the mainQueue
        }
    }
}

Here is how to do it if you're compressing a mix composition. You will need an AVMutableComposition, an AVAssetExportSession, and the compressFile(_:completion:) function above:

@IBAction func buttonTapped(sender: UIButton) {

    // show activity indicator

    let mixComposition = AVMutableComposition()
    // code to create mix ...

    // create a local file
    let tempDir = NSTemporaryDirectory()
    let dirPath = "\(tempDir)/videos_\(UUID().uuidString).mp4"
    let outputFileURL = URL(fileURLWithPath: dirPath)

    removeUrlFromFileManager(outputFileURL) // check to see if the file already exists, if it does remove it, code is at the bottom of the answer

    createAssetExportSession(mixComposition, outputFileURL)
}

// here is the AssetExportSession function with compressFile(_:completion:) inside the callback
func createAssetExportSession(_ mixComposition: AVMutableComposition, _ outputFileURL: URL) {
    
    // *** If your video/url doesn't have sound (not muted, but literally no audio track -- my iPhone's mic was broken when I recorded the video), use AVAssetExportPresetPassthrough instead of AVAssetExportPresetHighestQuality. When my video didn't have sound, exporter.status kept returning .failed *** You can check for sound using https://stackoverflow.com/a/64733623/4833705
    guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) else {
        // alert user there is a problem
        return
    }
    
    exporter.outputURL = outputFileURL
    exporter.outputFileType = AVFileType.mp4
    exporter.shouldOptimizeForNetworkUse = true
    exporter.exportAsynchronously {
        
        switch exporter.status {
        case .completed:
            print("completed")
            // view the AssetExportSession file size using HighestQuality which will be very high
            do {
                let data = try Data(contentsOf: outputFileURL)
                print("createAssetExportSession - file size: \(Double(data.count) / 1048576.0) MB")
            } catch let err as NSError {
                print("createAssetExportSession Error: \(err.localizedDescription)")
            }
        case .failed:
            print("failed:", exporter.error as Any)
            DispatchQueue.main.async { [weak self] in
                // remove activity indicator
                // alert user there is a problem
            }
            return
        case .cancelled:
            print("cancelled", exporter.error as Any)
            DispatchQueue.main.async { [weak self] in
                // remove activity indicator
                // alert user there is a problem
            }
            return
        default:
            print("complete")
        }
        
        guard let exporterOutputURL = exporter.outputURL else {
            // alert user there is a problem
            return
        }

        DispatchQueue.main.async { [weak self] in
            
            self?.compressFile(exporterOutputURL) { (compressedURL) in
               // remove activity indicator
               // do something with the compressedURL such as sending to Firebase or playing it in a player on the *main queue*
            }
        }
    }
}

Make sure to remove the compressedURL from the file system after you are done with it, e.g. before dismissing the view controller:

func dismissVC() {

    removeUrlFromFileManager(compressedURL)
    // dismiss vc ...
}

func removeUrlFromFileManager(_ outputFileURL: URL?) {
    if let outputFileURL = outputFileURL {
        
        let path = outputFileURL.path
        if FileManager.default.fileExists(atPath: path) {
            do {
                try FileManager.default.removeItem(atPath: path)
                print("url SUCCESSFULLY removed: \(outputFileURL)")
            } catch {
                print("Could not remove file at url: \(outputFileURL)")
            }
        }
    }
}
Lance Samaria
  • wow, this worked great. using Preset Highest and videoQuality high, I managed to reduce from 26 mb to 2 mb using 750,000 bitrate. quality still seems fine, even better than Preset Medium (I guess?), and size drop is insane. Thanks for sharing! – Onur Çevik Aug 07 '20 at 20:59
  • 4
    np, we all need to help each other. Cheers! – Lance Samaria Aug 07 '20 at 21:07
  • @LanceSamaria Thanks, work like a charm! One trifling issue with the code is that you used self.bitRate, and let bitrate: NSNumber. So it has an error with compiling. – David Dec 04 '20 at 11:07
  • 1
    @LanceSamaria it was just a camelCasing issue, with using bitRate instead of bitrate, that declared in the file – David Dec 04 '20 at 11:43
  • @DavidKyslenko thanks for pointing out the camel casing issue, I made the update. Cheers!!! – Lance Samaria Dec 04 '20 at 12:05
  • Can you help me I want to compress my audio? – Yogesh Patel Apr 06 '21 at 15:54
  • 1
    @YogeshPatel I never dealt with only audio before but I would think you would use the same exact code from above but exclude anything that has to do with video such as `videoFinished`, `videoTrack`, `videoReaderSettings`, `assetReaderVideoOutput`, `videoSettings`, `videoInput`, and `videoInputQueue`. If you remove those `variables` from the file then it should still work. Message me if it doesn't. – Lance Samaria Apr 06 '21 at 16:47
  • Hey, could you please look once? I added a question regarding this. https://stackoverflow.com/q/66972501/8201581 Thanks! – Yogesh Patel Apr 06 '21 at 20:17
  • @YogeshPatel You didn't try any of the code from my answer. You have to try the code and **exclude** the parts that I mentioned above and see what happens. Everything is explained on what to do in the answer from the code itself to how to use it. The question you added is closed – Lance Samaria Apr 06 '21 at 20:42
  • The question was closed because I hadn't added a good question and once I edit that and share the link with you. I tried this code it not reducing my audio storing the same size. I'll try one more time and let you know. Thanks! – Yogesh Patel Apr 07 '21 at 07:37
  • 1
    @YogeshPatel you must be doing something wrong. I just did exactly what I told you to do earlier, **exclude anything that says video**. I just tried an audio file from here https://www.ee.columbia.edu/~dpwe/sounds/music/, the **africa-toto.wav** file. It was **12.0 mb** before compression and then **3.0 mb** after compression. I used the same exact code from the answer and just **excluded** everything that I told you to exclude. – Lance Samaria Apr 07 '21 at 08:07
  • As per your comment I am doing same thing but facing one issue. Constant 'assetReaderAudioOutput' captured by a closure before being initialized Could you help me with this because AVAssetReaderTrackOutput has no init method? – Yogesh Patel Apr 07 '21 at 08:09
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/230834/discussion-between-lance-samaria-and-yogesh-patel). – Lance Samaria Apr 07 '21 at 08:10
  • I'm using the similar method, the compression and quality is really good. But it's really slow. It will take a few seconds (5s or more) to compress a 30s video. Any comments? – Bagusflyer Apr 15 '21 at 04:19
  • @Bagusflyer I don’t understand what you’re asking me. If you’re saying you’re using a similar method then that means it’s a different method then the one that I posted. How can I comment on something else? – Lance Samaria Apr 15 '21 at 04:23
  • @LanceSamaria Sorry, I asked the wrong person. Actually I'm using `AVAssetWriter` which is really slow. It took about 30s to compress a 500mb video file. By the way, I tried your code `compressFile`, it's also very slow. Any idea? – Bagusflyer Apr 15 '21 at 13:40
  • 1
    @Bagusflyer try to lower the bit as low as possible. There has to be a tradeoff. Get a fast compression with crappy quality or great quality with a slow compression. I notice the audio slows things up quite a bit, you can also try it on a background task but if you have anything inside the actual function itself like an alert make sure to update it and anything else on the mainQueue. Btw if the answer works please upvote it. If it doesn't work and you can find an answer that gives you fast compression and great results please post it. I would use that myself. – Lance Samaria Apr 15 '21 at 14:13
  • @Bagusflyer like this: `DispatchQueue.global(qos: .background).async { [weak self] in self?.compressFile(videoURL) { [weak self] (compressedURL) in } }`, but again inside the actual `compressFile` itself you have to update anything such as an alert on the mainQueue or the app will crash – Lance Samaria Apr 15 '21 at 14:20

Erik Wegener's code rewritten to Swift 3:

class func convertVideoToLowQuailtyWithInputURL(inputURL: NSURL, outputURL: NSURL, onDone: @escaping () -> ()) {
            //setup video writer
            let videoAsset = AVURLAsset(url: inputURL as URL, options: nil)
            let videoTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
            let videoSize = videoTrack.naturalSize
            let videoWriterCompressionSettings = [
                AVVideoAverageBitRateKey : Int(125000)
            ]

            let videoWriterSettings:[String : AnyObject] = [
                AVVideoCodecKey : AVVideoCodecH264 as AnyObject,
                AVVideoCompressionPropertiesKey : videoWriterCompressionSettings as AnyObject,
                AVVideoWidthKey : Int(videoSize.width) as AnyObject,
                AVVideoHeightKey : Int(videoSize.height) as AnyObject
            ]

            let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoWriterSettings)
            videoWriterInput.expectsMediaDataInRealTime = true
            videoWriterInput.transform = videoTrack.preferredTransform
            let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileTypeQuickTimeMovie)
            videoWriter.add(videoWriterInput)
            //setup video reader
            let videoReaderSettings:[String : AnyObject] = [
                kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) as AnyObject
            ]

            let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
            let videoReader = try! AVAssetReader(asset: videoAsset)
            videoReader.add(videoReaderOutput)
            //setup audio writer
            let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: nil)
            audioWriterInput.expectsMediaDataInRealTime = false
            videoWriter.add(audioWriterInput)
            //setup audio reader
            let audioTrack = videoAsset.tracks(withMediaType: AVMediaTypeAudio)[0]
            let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
            let audioReader = try! AVAssetReader(asset: videoAsset)
            audioReader.add(audioReaderOutput)
            videoWriter.startWriting()

            //start writing from video reader
            videoReader.startReading()
            videoWriter.startSession(atSourceTime: kCMTimeZero)
            let processingQueue = DispatchQueue(label: "processingQueue1")
            videoWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
                while videoWriterInput.isReadyForMoreMediaData {
                    let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
                    if videoReader.status == .reading && sampleBuffer != nil {
                        videoWriterInput.append(sampleBuffer!)
                    }
                    else {
                        videoWriterInput.markAsFinished()
                        if videoReader.status == .completed {
                            //start writing from audio reader
                            audioReader.startReading()
                            videoWriter.startSession(atSourceTime: kCMTimeZero)
                            let processingQueue = DispatchQueue(label: "processingQueue2")
                            audioWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
                                while audioWriterInput.isReadyForMoreMediaData {
                                    let sampleBuffer:CMSampleBuffer? = audioReaderOutput.copyNextSampleBuffer()
                                    if audioReader.status == .reading && sampleBuffer != nil {
                                        audioWriterInput.append(sampleBuffer!)
                                    }
                                    else {
                                        audioWriterInput.markAsFinished()
                                        if audioReader.status == .completed {
                                            videoWriter.finishWriting(completionHandler: {() -> Void in
                                                onDone();
                                            })
                                        }
                                    }
                                }
                            })
                        }
                    }
                }
            })
        }
parth

You can set the video quality when you open UIImagePickerController to any one of the following:

UIImagePickerControllerQualityType640x480
UIImagePickerControllerQualityTypeLow
UIImagePickerControllerQualityTypeMedium
UIImagePickerControllerQualityTypeHigh
UIImagePickerControllerQualityTypeIFrame960x540
UIImagePickerControllerQualityTypeIFrame1280x720

Try this code to change the quality type when opening the UIImagePickerController:

if (([UIImagePickerController isSourceTypeAvailable:
      UIImagePickerControllerSourceTypeCamera] == NO))
    return NO;
UIImagePickerController *cameraUI = [[UIImagePickerController alloc] init];
cameraUI.sourceType = UIImagePickerControllerSourceTypeCamera;
cameraUI.mediaTypes = [[NSArray alloc] initWithObjects: (NSString *) kUTTypeMovie, nil];

cameraUI.allowsEditing = NO;
cameraUI.delegate = self;
cameraUI.videoQuality = UIImagePickerControllerQualityTypeLow;//you can change the quality here
[self presentModalViewController:cameraUI animated:YES]; 
Mehdi
  • I've already tried UIImagePickerControllerQualityType. It doesn't work because putting the quality to medium or low changes he video aspect ratio...I want a way to decrease the size of a 720p video, not reduce a 720p video to 360p. – zakdances Aug 07 '12 at 17:51

Swift 4:

func convertVideoToLowQuailtyWithInputURL(inputURL: NSURL, outputURL: NSURL, completion: @escaping (Bool) -> Void) {

    let videoAsset = AVURLAsset(url: inputURL as URL, options: nil)
    let videoTrack = videoAsset.tracks(withMediaType: AVMediaType.video)[0]
    let videoSize = videoTrack.naturalSize
    let videoWriterCompressionSettings = [
        AVVideoAverageBitRateKey : Int(125000)
    ]

    let videoWriterSettings:[String : AnyObject] = [
        AVVideoCodecKey : AVVideoCodecH264 as AnyObject,
        AVVideoCompressionPropertiesKey : videoWriterCompressionSettings as AnyObject,
        AVVideoWidthKey : Int(videoSize.width) as AnyObject,
        AVVideoHeightKey : Int(videoSize.height) as AnyObject
    ]

    let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoWriterSettings)
    videoWriterInput.expectsMediaDataInRealTime = true
    videoWriterInput.transform = videoTrack.preferredTransform
    let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileType.mov)
    videoWriter.add(videoWriterInput)
    //setup video reader
    let videoReaderSettings:[String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) as AnyObject
    ]

    let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    var videoReader: AVAssetReader!

    do{

        videoReader = try AVAssetReader(asset: videoAsset)
    }
    catch {

        print("video reader error: \(error)")
        completion(false)
        return
    }
    videoReader.add(videoReaderOutput)
    //setup audio writer
    let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.add(audioWriterInput)
    //setup audio reader
    let audioTrack = videoAsset.tracks(withMediaType: AVMediaType.audio)[0]
    let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
    let audioReader = try! AVAssetReader(asset: videoAsset)
    audioReader.add(audioReaderOutput)
    videoWriter.startWriting()

    //start writing from video reader
    videoReader.startReading()
    videoWriter.startSession(atSourceTime: kCMTimeZero)
    let processingQueue = DispatchQueue(label: "processingQueue1")
    videoWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
        while videoWriterInput.isReadyForMoreMediaData {
            let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
            if videoReader.status == .reading && sampleBuffer != nil {
                videoWriterInput.append(sampleBuffer!)
            }
            else {
                videoWriterInput.markAsFinished()
                if videoReader.status == .completed {
                    //start writing from audio reader
                    audioReader.startReading()
                    videoWriter.startSession(atSourceTime: kCMTimeZero)
                    let processingQueue = DispatchQueue(label: "processingQueue2")
                    audioWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
                        while audioWriterInput.isReadyForMoreMediaData {
                            let sampleBuffer:CMSampleBuffer? = audioReaderOutput.copyNextSampleBuffer()
                            if audioReader.status == .reading && sampleBuffer != nil {
                                audioWriterInput.append(sampleBuffer!)
                            }
                            else {
                                audioWriterInput.markAsFinished()
                                if audioReader.status == .completed {
                                    videoWriter.finishWriting(completionHandler: {() -> Void in
                                        completion(true)
                                    })
                                }
                            }
                        }
                    })
                }
            }
        }
    })
}
Sarwar Jahan
  • is working well, but crashes if the video is not having audio because of this part of the code ` let audioTrack = videoAsset.tracks(withMediaType: AVMediaType.audio)[0]`. Any idea how can be fixed to work on video without audio? – Toto Feb 26 '20 at 12:17
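A minimal sketch of one way to avoid that crash (my own assumption, not part of the original answer): use `first` instead of force-indexing `[0]`, and skip the audio reader/writer setup entirely when the asset has no audio track. The helper name `addAudioInputIfPresent` is illustrative.

```swift
import AVFoundation

// Sketch (assumption): only wire up an audio writer input when an audio track exists.
// If this returns nil, the caller should call finishWriting after the video pass
// instead of starting an audio pass.
func addAudioInputIfPresent(to videoWriter: AVAssetWriter,
                            from videoAsset: AVAsset) -> AVAssetWriterInput? {
    guard videoAsset.tracks(withMediaType: .audio).first != nil else {
        return nil // no audio track: nothing to add
    }
    let audioWriterInput = AVAssetWriterInput(mediaType: .audio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.add(audioWriterInput)
    return audioWriterInput
}
```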

Erik Wegener's code rewritten to Swift:

class func convertVideoToLowQuailtyWithInputURL(inputURL: NSURL, outputURL: NSURL, onDone: () -> ()) {
    //setup video writer
    let videoAsset = AVURLAsset(URL: inputURL, options: nil)
    let videoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    let videoSize = videoTrack.naturalSize
    let videoWriterCompressionSettings = [
        AVVideoAverageBitRateKey : Int(125000)
    ]

    let videoWriterSettings:[String : AnyObject] = [
        AVVideoCodecKey : AVVideoCodecH264,
        AVVideoCompressionPropertiesKey : videoWriterCompressionSettings,
        AVVideoWidthKey : Int(videoSize.width),
        AVVideoHeightKey : Int(videoSize.height)
    ]

    let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoWriterSettings)
    videoWriterInput.expectsMediaDataInRealTime = true
    videoWriterInput.transform = videoTrack.preferredTransform
    let videoWriter = try! AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie)
    videoWriter.addInput(videoWriterInput)
    //setup video reader
    let videoReaderSettings:[String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
    ]

    let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    let videoReader = try! AVAssetReader(asset: videoAsset)
    videoReader.addOutput(videoReaderOutput)
    //setup audio writer
    let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.addInput(audioWriterInput)
    //setup audio reader
    let audioTrack = videoAsset.tracksWithMediaType(AVMediaTypeAudio)[0]
    let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
    let audioReader = try! AVAssetReader(asset: videoAsset)
    audioReader.addOutput(audioReaderOutput)
    videoWriter.startWriting()

    //start writing from video reader
    videoReader.startReading()
    videoWriter.startSessionAtSourceTime(kCMTimeZero)
    let processingQueue = dispatch_queue_create("processingQueue1", nil)
    videoWriterInput.requestMediaDataWhenReadyOnQueue(processingQueue, usingBlock: {() -> Void in
        while videoWriterInput.readyForMoreMediaData {
            let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
            if videoReader.status == .Reading && sampleBuffer != nil {
                videoWriterInput.appendSampleBuffer(sampleBuffer!)
            }
            else {
                videoWriterInput.markAsFinished()
                if videoReader.status == .Completed {
                    //start writing from audio reader
                    audioReader.startReading()
                    videoWriter.startSessionAtSourceTime(kCMTimeZero)
                    let processingQueue = dispatch_queue_create("processingQueue2", nil)
                    audioWriterInput.requestMediaDataWhenReadyOnQueue(processingQueue, usingBlock: {() -> Void in
                        while audioWriterInput.readyForMoreMediaData {
                            let sampleBuffer:CMSampleBufferRef? = audioReaderOutput.copyNextSampleBuffer()
                            if audioReader.status == .Reading && sampleBuffer != nil {
                                audioWriterInput.appendSampleBuffer(sampleBuffer!)
                            }
                            else {
                                audioWriterInput.markAsFinished()
                                if audioReader.status == .Completed {
                                    videoWriter.finishWritingWithCompletionHandler({() -> Void in
                                        onDone();
                                    })
                                }
                            }
                        }
                    })
                }
            }
        }
    })
}
Use exportSession.fileLengthLimit = 1024 * 1024 * 10 //10 MB

10 MB is a hard-coded number here. Set it according to your required bitrate and duration.

fileLengthLimit

Indicates the file length that the output of the session should not exceed. Depending on the content of the source asset, it is possible for the output to slightly exceed the file length limit. The length of the output file should be tested if you require that a strict limit be observed before making use of the output. See also maxDuration and timeRange.

developer.apple.com/documentation/avfoundation/avassetexportsession/1622333-filelengthlimit
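As a rough sketch of how to pick that number instead of hard-coding 10 MB (all figures below are illustrative assumptions, not from the answer): a size budget is roughly (video bitrate + audio bitrate) times duration, in bits, divided by 8.

```swift
// Illustrative size budget: bits per second * seconds / 8 = bytes.
let videoBitrate = 2_000_000.0   // bits per second (assumed target)
let audioBitrate = 128_000.0     // bits per second (assumed target)
let durationSeconds = 30.0       // assumed clip length
let byteBudget = Int64((videoBitrate + audioBitrate) * durationSeconds / 8.0)
// exportSession.fileLengthLimit = byteBudget
print(byteBudget)  // 7980000 bytes, i.e. roughly 8 MB for these numbers
```

Note the documentation's caveat above: the output can slightly exceed this limit, so leave some headroom if the limit is strict.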

Kumar

There is an awesome custom class (SDAVAssetExportSession) to do the video compression. You can download it from this link.

After downloading, add the SDAVAssetExportSession.h and SDAVAssetExportSession.m files into your project, then use the code below to do the compression. In the code below you can compress the video by specifying resolution and bitrate:

#import "SDAVAssetExportSession.h"


- (void)compressVideoWithInputVideoUrl:(NSURL *) inputVideoUrl
{
    /* Create Output File Url */

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSString *finalVideoURLString = [documentsDirectory stringByAppendingPathComponent:@"compressedVideo.mp4"];
    NSURL *outputVideoUrl = ([[NSURL URLWithString:finalVideoURLString] isFileURL] == 1)?([NSURL URLWithString:finalVideoURLString]):([NSURL fileURLWithPath:finalVideoURLString]); // Url Should be a file Url, so here we check and convert it into a file Url


    SDAVAssetExportSession *compressionEncoder = [SDAVAssetExportSession.alloc initWithAsset:[AVAsset assetWithURL:inputVideoUrl]]; // provide inputVideo Url Here
    compressionEncoder.outputFileType = AVFileTypeMPEG4;
    compressionEncoder.outputURL = outputVideoUrl; //Provide output video Url here
    compressionEncoder.videoSettings = @
    {
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: @800,   //Set your resolution width here
    AVVideoHeightKey: @600,  //set your resolution height here
    AVVideoCompressionPropertiesKey: @
        {
        AVVideoAverageBitRateKey: @45000, // Give your bitrate here for lower size give low values
        AVVideoProfileLevelKey: AVVideoProfileLevelH264High40,
        },
    };
    compressionEncoder.audioSettings = @
    {
    AVFormatIDKey: @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey: @2,
    AVSampleRateKey: @44100,
    AVEncoderBitRateKey: @128000,
    };

    [compressionEncoder exportAsynchronouslyWithCompletionHandler:^
     {
         if (compressionEncoder.status == AVAssetExportSessionStatusCompleted)
         {
            NSLog(@"Compression Export Completed Successfully");
         }
         else if (compressionEncoder.status == AVAssetExportSessionStatusCancelled)
         {
             NSLog(@"Compression Export Canceled");
         }
         else
         {
              NSLog(@"Compression Failed");

         }
     }];

}

To Cancel Compression Use Below Line Of code

 [compressionEncoder cancelExport]; //Video compression cancel
arunjos007

Swift 5

A higher bitrate gives higher quality but also takes more space.

Improved version of https://stackoverflow.com/a/62862102/7668778

This works with videos from iCloud and with multiple videos at the same time. It's also possible to report progress through a block parameter instead of a delegate.

import Foundation
import AVKit

protocol VideoEditorDelegate: AnyObject {
    
    func propagate(event: VideoEditor.PropagateEvent)
    
}

final class VideoEditor {
    
    var assetReaders: [AVAssetReader] = []
    var assetWriters: [AVAssetWriter] = []
    
    /// Compress a video URL to H264 in mp4 format
    /// - Parameters:
    ///   - asset: video asset to compress
    ///   - uuID: unique id to identify progress of each video
    ///   - delegate: to handle progress
    ///   - completion: new compressed video url
    func compressVideo(_ asset: AVAsset,
                       uuID: String?,
                       delegate: VideoEditorDelegate?,
                       bitrate: NSNumber = NSNumber(value: 6000000),
                       completion: @escaping (Result<URL, Error>) -> Void) {
        DispatchQueue.global().async {
            //Patch for iCloud videos
            guard let videoData = asset.dataToUpload,
                  let asset = self.avAssetFrom(data: videoData) else {
                completion(.failure(Errors.nilAVAssetData))
                return
            }
            
            guard let reader = try? AVAssetReader(asset: asset) else {
                completion(.failure(Errors.nilAssetReader))
                return
            }
            self.assetReaders.append(reader) //To prevent reader being release while reading
            guard let videoTrack = asset.tracks(withMediaType: .video).first else {
                completion(.failure(Errors.nilVideoTrack))
                return
            }
            let videoReaderSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB]
            let assetReaderVideoOutput = AVAssetReaderTrackOutput(track: videoTrack,
                                                                  outputSettings: videoReaderSettings)
            
            guard let audioTrack = asset.tracks(withMediaType: .audio).first else {
                completion(.failure(Errors.failToAddAudio))
                return
            }
            
            let audioReaderSettings: [String : Any] = [
                AVFormatIDKey: kAudioFormatLinearPCM,
                AVSampleRateKey: 44100,
                AVNumberOfChannelsKey: 2
            ]
            let assetReaderAudioOutput = AVAssetReaderTrackOutput(track: audioTrack,
                                                                  outputSettings: audioReaderSettings)
            guard reader.canAdd(assetReaderAudioOutput) else {
                completion(.failure(Errors.failToAddAudio))
                return
            }
            reader.add(assetReaderAudioOutput)
            
            guard reader.canAdd(assetReaderVideoOutput) else {
                completion(.failure(Errors.failToAddVideo))
                return
            }
            reader.add(assetReaderVideoOutput)
            
            let videoSettings: [String : Any] = [
                AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: bitrate],
                AVVideoCodecKey: AVVideoCodecType.h264,
                AVVideoHeightKey: videoTrack.naturalSize.height,
                AVVideoWidthKey: videoTrack.naturalSize.width,
                AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill
            ]
            
            let audioSettings: [String: Any] = [
                AVFormatIDKey: kAudioFormatMPEG4AAC,
                AVNumberOfChannelsKey: 2,
                AVSampleRateKey: 44100.0,
                AVEncoderBitRateKey: 128000
            ]
            
            let audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio,
                                                outputSettings: audioSettings)
            let videoInput = AVAssetWriterInput(mediaType: AVMediaType.video,
                                                outputSettings: videoSettings)
            videoInput.transform = videoTrack.preferredTransform
            
            let videoInputQueue = DispatchQueue(label: "videoQueue")
            let audioInputQueue = DispatchQueue(label: "audioQueue")
            
            let formatter = DateFormatter()
            formatter.dateFormat = "yyyy'-'MM'-'dd'T'HH':'mm':'ss'Z'"
            let date = Date()
            let tempDir = NSTemporaryDirectory()
            let outputPath = "\(tempDir)/\(formatter.string(from: date)).mp4"
            let outputURL = URL(fileURLWithPath: outputPath)
            
            guard let writer = try? AVAssetWriter(outputURL: outputURL,
                                                  fileType: AVFileType.mp4) else {
                completion(.failure(Errors.nilAssetWriter))
                return
            }
            self.assetWriters.append(writer) //To prevent writer being release while writing
            writer.shouldOptimizeForNetworkUse = true
            writer.add(videoInput)
            writer.add(audioInput)
            
            writer.startWriting()
            reader.startReading()
            writer.startSession(atSourceTime: CMTime.zero)
            
            let group = DispatchGroup()

            group.enter()
            audioInput.requestMediaDataWhenReady(on: audioInputQueue) {
                while(audioInput.isReadyForMoreMediaData) {
                    if let cmSampleBuffer = assetReaderAudioOutput.copyNextSampleBuffer() {
                        audioInput.append(cmSampleBuffer)
                    } else {
                        audioInput.markAsFinished()
                        group.leave()
                    }
                }
            }

            group.enter()
            let videoLength = CMTimeGetSeconds(asset.duration)
            videoInput.requestMediaDataWhenReady(on: videoInputQueue) {
                while(videoInput.isReadyForMoreMediaData) {
                    if let cmSampleBuffer = assetReaderVideoOutput.copyNextSampleBuffer() {
                        videoInput.append(cmSampleBuffer)
                        //Show progress
                        if let uuID = uuID, let delegate = delegate {
                            let timeStamp = CMSampleBufferGetPresentationTimeStamp(cmSampleBuffer)
                            let timeInSecond = CMTimeGetSeconds(timeStamp)
                            let progress = Float(timeInSecond / videoLength)
                            DispatchQueue.main.async {
                                delegate.propagate(event: .progress(progress,
                                                                     uuID: uuID))
                            }
                        }
                    } else {
                        videoInput.markAsFinished()
                        group.leave()
                    }
                }
            }

            let closeWriter: () -> Void = {
                Task {
                    await writer.finishWriting()
                    do {
                        let data = try Data(contentsOf: writer.outputURL)
                        //TODO ale: track file size
                        print("compressFile - file size after compression: \(Double(data.count) / 1_048_576) mb")
                    } catch {
                        completion(.failure(error))
                        return
                    }
                    completion(.success(writer.outputURL))
                    self.assetWriters = self.assetWriters.filter { $0.outputURL != outputURL }
                    self.assetReaders = self.assetReaders.filter { $0.asset != asset }
                }
            }

            group.notify(queue: .global()) {
                closeWriter()
            }
        }
    }
    
    func avAssetFrom(data: Data) -> AVAsset? {
        let directory = NSTemporaryDirectory()
        let fileName = "\(NSUUID().uuidString).mov"
        guard let fullURL = NSURL.fileURL(withPathComponents: [directory, fileName]) else {
            return nil
        }
        do {
            try data.write(to: fullURL)
            return AVAsset(url: fullURL)
        } catch {
            return nil
        }
    }
    
}

// MARK: - Helping Structures

extension VideoEditor {
    
    enum PropagateEvent {
        case progress(Float, uuID: String)
    }
    
    enum Errors: Error {
        case nilAssetReader
        case nilAssetWriter
        case failToAddAudio
        case failToAddVideo
        case nilVideoTrack
        case nilAVAssetData
    }
    
}
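The `bitrate` parameter defaults to 6 Mb/s regardless of the video's resolution. If you'd rather scale the bitrate with the video's dimensions, a common heuristic is bits-per-pixel × width × height × frame rate; the sketch below illustrates that idea (the helper name `suggestedBitrate` and the 0.1 bpp factor are my assumptions, not from the answer).

```swift
import Foundation

/// Suggests an H.264 average bitrate from resolution and frame rate.
/// bitsPerPixel around 0.1 is a common medium-quality ballpark for H.264.
func suggestedBitrate(width: Int, height: Int, fps: Double, bitsPerPixel: Double = 0.1) -> Int {
    Int(Double(width * height) * fps * bitsPerPixel)
}

// 720p at 30 fps works out to roughly 2.8 Mb/s; pass the result
// (wrapped in NSNumber) as the `bitrate` parameter of compressVideo.
let bitrate720p = suggestedBitrate(width: 1280, height: 720, fps: 30)
```

Reading the real dimensions from `videoTrack.naturalSize` before calling `compressVideo` keeps small videos from being inflated and large ones from being over-compressed.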
alegelos