
I'm trying to extract an image from a video, then use that image to generate a still movie. The first step works well, but the second step produces a malformed video after I set appliesPreferredTrackTransform = true.

[Image: normal image extracted from the video]
[Image: malformed video generated from the image]

How did this happen? A normal image generated a malformed video? Besides, if I put the GenerateMovieFromImage.generateMovieWithImage block in #2, the app crashes at CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image);

I did it as below (in Swift):

    var asset: AVAsset  = AVAsset.assetWithURL(self.tmpMovieURL!) as AVAsset
    var imageGen: AVAssetImageGenerator =  AVAssetImageGenerator(asset: asset)
    var time: CMTime = CMTimeMake(0, 60)
    imageGen.appliesPreferredTrackTransform = true
    imageGen.generateCGImagesAsynchronouslyForTimes( [ NSValue(CMTime:time) ], completionHandler: {

        (requestTime, image, actualTime, result, error) -> Void in
            if result == AVAssetImageGeneratorResult.Succeeded {



                ALAssetsLibrary().writeImageToSavedPhotosAlbum(image, metadata: nil, completionBlock: {
                    (nsurl, error) in
                       // #2                    
                })

                 GenerateMovieFromImage.generateMovieWithImage(image, completionBlock:{
                        (genMovieURL) in
                        handler(genMovieURL)

                })
            }
    })

The GenerateMovieFromImage.generateMovieWithImage method came from this answer:

+ (void)generateMovieWithImage:(CGImageRef)image completionBlock:(GenerateMovieWithImageCompletionBlock)handler
{

NSLog(@"%@", image);

NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent: [@"tmpgen" stringByAppendingPathExtension:@"mov"  ] ];

NSURL *videoUrl = [NSURL fileURLWithPath:path];

if ([[NSFileManager defaultManager] fileExistsAtPath:path] ) {
    NSError *error;
    if ([[NSFileManager defaultManager] removeItemAtPath:path error:&error] == NO) {
        NSLog(@"removeitematpath %@ error :%@", path, error);
    }
}


// TODO: the image needs to be rotated programmatically, not by hand
int width = (int)CGImageGetWidth(image);
int height = (int)CGImageGetHeight(image);

NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:videoUrl
                               fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];
NSParameterAssert(videoWriter);

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:width], AVVideoWidthKey,
                               [NSNumber numberWithInt:height], AVVideoHeightKey,
                               nil];
AVAssetWriterInput* writerInput = [AVAssetWriterInput
                                    assetWriterInputWithMediaType:AVMediaTypeVideo
                                    outputSettings:videoSettings] ; //retain should be removed if ARC


NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                 assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                 sourcePixelBufferAttributes:nil ];

//    2) Start a session:
NSLog(@"start session");

[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero]; //use kCMTimeZero if unsure


dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
[writerInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{

     if ([writerInput isReadyForMoreMediaData]) {


         //    3) Write some samples:

         // Or you can use AVAssetWriterInputPixelBufferAdaptor.
         // That lets you feed the writer input data from a CVPixelBuffer
         // that’s quite easy to create from a CGImage.


         CVPixelBufferRef sampleBuffer = [self newPixelBufferFromCGImage:image];

         if (sampleBuffer) {
             CMTime frameTime = CMTimeMake(150,30);
            [adaptor appendPixelBuffer:sampleBuffer withPresentationTime:kCMTimeZero];
            [adaptor appendPixelBuffer:sampleBuffer withPresentationTime:frameTime];
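             // the two appends bracket the clip: one frame at t = 0 and one at t = 150/30 = 5 s,
             // so the single image covers roughly five seconds of video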
             CFRelease(sampleBuffer);
         }
     }


    //    4) Finish the session:

    [writerInput markAsFinished];
    [videoWriter endSessionAtSourceTime:CMTimeMakeWithSeconds(5, 30.0)]; // optional; can call finishWriting without specifying an end time
    // [videoWriter finishWriting]; //deprecated in ios6
    NSLog(@"to finnish writing");

    [videoWriter finishWritingWithCompletionHandler:^{
        NSLog(@"%@",videoWriter);
        NSLog(@"finishWriting..");

        handler(videoUrl);

        ALAssetsLibrary* library = [[ALAssetsLibrary alloc] init];
        [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:path] completionBlock: ^(NSURL *assetURL, NSError *error){
            if( error != nil) {
                NSLog(@"writeVideoAtPathToSavedPhotosAlbum error: %@" , error);
            }

        }];
    }]; //ios 6.0+

}];





}


+ (CVPixelBufferRef) newPixelBufferFromCGImage: (CGImageRef)image
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                         nil];
CVPixelBufferRef pxbuffer = NULL;


CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image) );

NSLog(@"width:%f", frameSize.width);
NSLog(@"height:%f", frameSize.height);



CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
                                      frameSize.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options,
                                      &pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);

CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width,
                                             frameSize.height, 8, 4*frameSize.width, rgbColorSpace,
                                             (CGBitmapInfo)kCGImageAlphaNoneSkipFirst
                                             );
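// Note: the bytes-per-row value above assumes tightly packed rows (4 bytes per pixel, no padding);
// CVPixelBufferGetBytesPerRow(pxbuffer) reports the buffer's actual stride, which may be larger.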

NSParameterAssert(context);



CGContextConcatCTM(context, CGAffineTransformIdentity);
CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);

CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

return pxbuffer;
}
– Alex Chan

3 Answers


Try this:

NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary ];

I found your problem:

(requestTime, image, actualTime, result, error) -> Void in
     if result == AVAssetImageGeneratorResult.Succeeded {

        let img : UIImage = UIImage(CGImage: image)! // retain 
        UIImageWriteToSavedPhotosAlbum(img,nil,nil,nil) // synchron

        GenerateMovieFromImage.generateMovieWithImage(image, completionBlock:{
                (genMovieURL) in
                        handler(genMovieURL)

         })

I checked and it all works. If you still have a problem, then the problem is with your device.

– user1502383
  • It does not work either. I'm trying to use my iPad to test whether it's my iPhone's problem. But I think the problem may be in the transform matrix of AVFoundation or CoreGraphics. – Alex Chan Dec 11 '14 at 12:05
  • Add this check: if (![videoWriter startWriting]) and also check the result of appendPixelBuffer. You can see the error via videoWriter.error (see the sketch after these comments). – user1502383 Dec 11 '14 at 13:29
  • They all return YES and no error is logged. I also tried, but I cannot get the program to run on my iPad (it can't be jailbroken). – Alex Chan Dec 11 '14 at 16:46
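
A minimal sketch of the checks suggested in the comment above, assuming the videoWriter, adaptor, sampleBuffer and frameTime names from the question's code; the log messages are illustrative only:

// Hedged sketch: surface videoWriter.error instead of silently ignoring failures.
if (![videoWriter startWriting]) {
    NSLog(@"startWriting failed: %@", videoWriter.error);
    return;
}
[videoWriter startSessionAtSourceTime:kCMTimeZero];

// ... later, when appending frames:
if (![adaptor appendPixelBuffer:sampleBuffer withPresentationTime:frameTime]) {
    NSLog(@"appendPixelBuffer failed: %@ (status %ld)",
          videoWriter.error, (long)videoWriter.status);
}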

It seems that I have made some improvement.

In the method (CVPixelBufferRef)newPixelBufferFromCGImage:(CGImageRef)image,

change:

NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                     [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                     [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                     nil];

to:

NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                         [NSNumber numberWithInt:4*frameSize.width], kCVPixelBufferBytesPerRowAlignmentKey,nil];

The output is no longer malformed, but it seems to have been scaled along its y-axis to 0.5 of its normal height!

I'm still working to solve this problem.


Update:

I've totally solved this problem. It is because of the transform attribute, i.e. preferredTransform:

"The transform specified in the track's storage container as the preferred transformation of the visual media data for display purposes."

This means that the real orientation of a video file may not be what you see when playing it.

AVPlayer uses the transform stored in the file when playing. However, generateCGImagesAsynchronouslyForTimes ignores this attribute and returns a landscape (for example) picture, so you need to set the transform back on the writer to match the original video file.

Just add this line:

writerInput.transform = CGAffineTransformMakeRotation(M_PI_2);

right after the writer input is created:

AVAssetWriterInput* writerInput = [AVAssetWriterInput
                                       assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings]; //retain should be removed if ARC

This solves the problem.
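
If you'd rather not hard-code the M_PI_2 rotation, a possible variant (a sketch under my own assumptions, not part of the original fix) is to copy the source track's preferredTransform onto the writer input, using the same asset the image was generated from:

// Sketch (assumption): reuse the source track's preferred transform
// so portrait and landscape sources both come out with the right orientation.
AVAssetTrack *sourceTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
if (sourceTrack) {
    writerInput.transform = sourceTrack.preferredTransform;
}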

– Alex Chan
  • Solved by multiplying the height by 2, but I don't know why: CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width, frameSize.height*2, 8, 4*frameSize.width, rgbColorSpace, (CGBitmapInfo)kCGImageAlphaNoneSkipFirst); CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height*2), image); – Alex Chan Dec 18 '14 at 03:40
  • The video encoders you're using may have width restrictions, e.g. the width must be divisible by two or four. Have you tried changing your width? – Rhythmic Fistman Feb 03 '15 at 19:14

I too got this error when I tried to create a video from an array of images and a music file. It's because of the video frame ratio, so check the frame size used for the video composition. For your reference: http://size43.com/jqueryVideoTool.html
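
Building on this and on the earlier comment about width restrictions, a possible sanity check (a sketch and an assumption, not part of either answer) is to force the output dimensions to even values before building the writer's videoSettings:

// Sketch (assumption): H.264 encoders generally want even dimensions,
// so clear the low bit of the width and height taken from the image.
int width  = (int)CGImageGetWidth(image)  & ~1;
int height = (int)CGImageGetHeight(image) & ~1;

NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                 AVVideoWidthKey  : @(width),
                                 AVVideoHeightKey : @(height) };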

– Pushpa Raja