
I am trying to create a video file from images produced by the ImageMagick library. After applying some effects one by one (such as varying the opacity), the file is created successfully, but QuickTime Player gives the error "video file could not be opened. The movie's file format isn't recognized".

I am using the following code :

double d = 0.00;

- (void)posterizeImageWithCompression:(id)sender {

    // Here we use JPEG compression.
    NSLog(@"we're using JPEG compression");

    MagickWandGenesis();
    magick_wand = [self magiWandWithImage:[UIImage imageNamed:@"iphone.png"]];

    MagickBooleanType status;

    status = MagickSetImageOpacity(magick_wand, d);

    if (status == MagickFalse) {
        ThrowWandException(magick_wand);
    }
    size_t my_size;
    unsigned char * my_image = MagickGetImageBlob(magick_wand, &my_size);
    NSData * data = [[NSData alloc] initWithBytes:my_image length:my_size];
    free(my_image);
    magick_wand = DestroyMagickWand(magick_wand);
    MagickWandTerminus();
    UIImage * image = [[UIImage alloc] initWithData:data];

    d = d + 0.05;

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *imagePath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%f.png", d]];
    NSData *data1 = UIImagePNGRepresentation(image);

    if (d <= 1.0 ) {

        [data1 writeToFile:imagePath atomically:YES];

        [imageViewButton setImage:image forState:UIControlStateNormal];

        // If ready to have more media data
         if (assetWriterPixelBufferAdaptor.assetWriterInput.readyForMoreMediaData) {
             CVReturn cvErr = kCVReturnSuccess;
             // get screenshot image!
             CGImageRef image1 = (CGImageRef) image.CGImage;

             // prepare the pixel buffer
             CVPixelBufferRef pixelsBuffer = NULL;

            // pixelsBuffer = [self pixelBufferFromCGImage:image1];

             CFDataRef imageData= CGDataProviderCopyData(CGImageGetDataProvider(image1));
             NSLog (@"copied image data");
             cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                  FRAME_WIDTH,
                                                  FRAME_HEIGHT,
                                                  kCVPixelFormatType_32BGRA,
                                                  (void*)CFDataGetBytePtr(imageData),
                                                  CGImageGetBytesPerRow(image1),
                                                  NULL,
                                                  NULL,
                                                  NULL,
                                                  &pixelsBuffer);
             NSLog (@"CVPixelBufferCreateWithBytes returned %d", cvErr);

             // calculate the time
             CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
             CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
             NSLog (@"elapsedTime: %f", elapsedTime);
             CMTime presentationTime =  CMTimeMake (elapsedTime * TIME_SCALE, TIME_SCALE);

             // write the sample
             BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelsBuffer withPresentationTime:presentationTime];

             if (appended) {
                 NSLog (@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
             } else {
                 NSLog (@"failed to append");
                 [self stopRecording];
             }

             // Release pixel buffer
             CVPixelBufferRelease(pixelsBuffer);
             CFRelease(imageData);
         }
     }


}
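For context, the commented-out `pixelBufferFromCGImage:` call refers to a helper that re-renders the CGImage into a freshly allocated buffer, so the bytes are guaranteed to actually be `kCVPixelFormatType_32BGRA` (instead of reinterpreting the CGImage's raw bytes, whose layout may differ). This is a rough sketch of such a helper, not my exact code:

```objc
- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)cgImage {
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    NSDictionary *options = @{ (id)kCVPixelBufferCGImageCompatibilityKey : @YES,
                               (id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES };
    CVPixelBufferRef buffer = NULL;
    CVReturn err = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                       kCVPixelFormatType_32BGRA,
                                       (__bridge CFDictionaryRef)options, &buffer);
    if (err != kCVReturnSuccess) return NULL;

    CVPixelBufferLockBaseAddress(buffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Draw the image into the buffer so the pixel layout really is BGRA.
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                                 width, height, 8,
                                                 CVPixelBufferGetBytesPerRow(buffer),
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(buffer, 0);
    return buffer; // caller releases with CVPixelBufferRelease
}
```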

It also shows a crash trace like:

VideoToolbox`vt_Copy_32BGRA_2vuyITU601 + 91
VideoToolbox`vtPixelTransferSession_InvokeBlitter + 574
VideoToolbox`VTPixelTransferSessionTransferImage + 14369
VideoToolbox`VTCompressionSessionEncodeFrame + 1077
MediaToolbox`sbp_vtcs_processSampleBuffer + 599
Lou Franco
BADRI
  • So you wants to create video from images? – Vishal Sharma Nov 16 '15 at 13:22
  • have you tried using ffmpeg for this feature? – Swati Nov 19 '15 at 10:12
  • I think it could be because of an incorrect presentationTime. What is the value given to `TIME_SCALE`, and what FPS do you expect your final movie to be? – Sumeet Dec 22 '15 at 15:15
  • Any particular reason to use Image Magick? If not, here's a better way to create a movie from images: http://stackoverflow.com/questions/3741323/how-do-i-export-uiimage-array-as-a-movie/3742212#3742212 –  Nov 30 '16 at 00:56

1 Answer


QuickTime Player is not being very informative. Usually, a player will name the missing codec. If you generated the file and played it back programmatically on the same computer where QuickTime failed, then for whatever reason the codec the library could obviously use (because it encoded with it) is not registered with the OS. In a shell, you can run `file <path>` to get the file type and possibly the codec. If that is not enough, you should be able to identify the codec with VLC, transcode, or GStreamer, and then follow Apple's instructions for downloading and installing the needed codec into the OS.
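For example, `file` identifies the container from the magic bytes at the start of the file; a QuickTime container begins with an `ftyp` box, which is what players key on. The file name `sample.mov` below is just an illustration (here we write a minimal header by hand so there is something to inspect):

```shell
# Write a minimal QuickTime-style header ('ftyp' box, brand 'qt  ')
# to a scratch file, then ask 'file' to identify the container.
printf '\000\000\000\024ftypqt  \000\000\000\000qt  ' > sample.mov
file sample.mov
```

If the container itself is readable, `ffprobe` (part of ffmpeg) or VLC will additionally report the codec of each stream.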

Douglas Daseeco