
I am trying to load a .jpg image together with a .mov file in Objective-C on an iOS device to display a Live Photo, and I wrote the following code snippet to do that in viewDidLoad:

- (void)viewDidLoad {
    [super viewDidLoad];

    PHLivePhotoView *photoView = [[PHLivePhotoView alloc]initWithFrame:self.view.bounds];

    NSURL *imageUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"jpg"];
    NSURL *videoUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"mov"];

    [PHLivePhoto requestLivePhotoWithResourceFileURLs:@[videoUrl, imageUrl]
                                     placeholderImage:[UIImage imageNamed:@"livePhoto.jpg"]
                                           targetSize:self.view.bounds.size
                                          contentMode:PHImageContentModeAspectFit
                                        resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
        NSLog(@"we are in handler");
        photoView.livePhoto = livePhoto;
        photoView.contentMode = UIViewContentModeScaleAspectFit;
        photoView.tag = 87;
        [self.view addSubview:photoView];
        [self.view sendSubviewToBack:photoView];
    }];


}

I have dragged the files livePhoto.jpg and livePhoto.mov into the Xcode project.

But when I build and run it, Xcode logs these errors:

2017-11-28 17:46:08.568455+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.580439+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.597147+0800 Live Photos[3669:1276806] Error: Invalid image metadata
2017-11-28 17:46:08.607881+0800 Live Photos[3669:1276806] Error: Invalid video metadata
2017-11-28 17:46:08.608329+0800 Live Photos[3669:1276778] we are in handler

Any idea about that? Thanks.

And another thing to ask:

Why was the resultHandler called twice, according to what is printed?

– armnotstrong

1 Answer


TL;DR

Here's the code to store Live Photos and upload them to a server:
1. Capturing the Live Photo

- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
error:(NSError *)error {
    if (error) {
        [self raiseError:error];
        return;
    }
    NSData *imageData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
                                                                    previewPhotoSampleBuffer:previewPhotoSampleBuffer];
    CIImage *image = [CIImage imageWithData:imageData];
    [self.expectedAsset addInput:image.properties]; // 1. This is the metadata (which will be lost in step 2.)
    [self.expectedAsset addInput:[UIImage imageWithCIImage:image]]; // 2. Creating the image, but UIImage is not designed to carry the required metadata
}
- (void)captureOutput:(AVCapturePhotoOutput *)output 
didFinishProcessingLivePhotoToMovieFileAtURL:(NSURL *)outputFileURL duration:(CMTime)duration photoDisplayTime:(CMTime)photoDisplayTime resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings error:(nullable NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:outputFileURL]; // 3. Store the URL to the actual video file
    }
}

expectedAsset is just an object holding all the required information; you could use an NSDictionary instead. The snippet above uses the delegate API that was deprecated in iOS 11, so here's the same step with the iOS 11+ replacement:

#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wunguarded-availability"
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:[photo metadata]]; // 1. The metadata dictionary
        [self.expectedAsset addInput:[UIImage imageWithData:[photo fileDataRepresentation]]]; // 2. The still image
    }
}
#pragma clang diagnostic pop 
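
For context (this setup is not part of the original answer, and self.session plus the property names are assumptions): the delegate callbacks above only fire if Live Photo capture was enabled on the AVCapturePhotoOutput beforehand. A minimal sketch:

- (void)configureLivePhotoCapture {
    // Assumed: self.session is a configured, running AVCaptureSession.
    AVCapturePhotoOutput *photoOutput = [AVCapturePhotoOutput new];
    [self.session addOutput:photoOutput]; // support is only reported once attached
    if (photoOutput.livePhotoCaptureSupported) {
        photoOutput.livePhotoCaptureEnabled = YES;
    }

    AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings];
    // The movie complement of the Live Photo is written to this URL and handed
    // back in didFinishProcessingLivePhotoToMovieFileAtURL: (step 3 above).
    NSString *movName = [[NSUUID UUID].UUIDString stringByAppendingPathExtension:@"mov"];
    settings.livePhotoMovieFileURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:movName]];
    [photoOutput capturePhotoWithSettings:settings delegate:self];
}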


2. Generate NSData

- (NSData *)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1); // This is the UIImage (without metadata) from step 2 above
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);
    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);
    NSMutableDictionary *maker = [NSMutableDictionary new];
    // imageMetadata is the dictionary from step 1 above
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary]
              forKey:(NSString *)kCGImagePropertyMakerAppleDictionary];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)maker);
    CGImageDestinationFinalize(destination);
    // CGImageSourceRef/CGImageDestinationRef are not managed by ARC
    CFRelease(destination);
    CFRelease(source);
    return dest_data;
}

- (void)dataRepresentation:(DataRepresentationLoaded)callback {
    // livePhotoURL is the URL from step 3 above
    callback(@{@"image": self.imageData,
               @"video": [NSData dataWithContentsOfURL:self.livePhotoURL]});
}
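
Hypothetical usage (not part of the original answer; asset stands for the expectedAsset object from step 1, and the temp-file names are made up): writing both blobs to disk and rebuilding the Live Photo doubles as a check that the metadata survived:

[asset dataRepresentation:^(NSDictionary *data) {
    NSURL *jpgURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"livePhoto.jpg"]];
    NSURL *movURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"livePhoto.mov"]];
    [data[@"image"] writeToURL:jpgURL atomically:YES];
    [data[@"video"] writeToURL:movURL atomically:YES];

    // Rebuild the Live Photo from the two files we just wrote.
    [PHLivePhoto requestLivePhotoWithResourceFileURLs:@[jpgURL, movURL]
                                     placeholderImage:nil
                                           targetSize:CGSizeZero
                                          contentMode:PHImageContentModeAspectFit
                                        resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
        NSLog(@"rebuilt live photo: %@ (info: %@)", livePhoto, info);
    }];
}];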

Long Answer

This is caused by wrong metadata in the video/image files. When creating a Live Photo, PHLivePhoto looks for the key 17 in the image's kCGImagePropertyMakerAppleDictionary (this is the asset identifier) and matches it against the com.apple.quicktime.content.identifier of the mov file. The mov file also needs an entry for the time at which the still image was captured (com.apple.quicktime.still-image-time).

Make sure your files haven't been edited (or exported) somewhere along the way. Even the UIImageJPEGRepresentation function will strip this metadata from the image.
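
If in doubt, both identifiers can be checked in code before handing the files to PHLivePhoto. A minimal sketch, assuming the bundle resources from the question (needs ImageIO and AVFoundation):

// Image side: the asset identifier sits under entry "17" of the MakerApple dictionary.
NSURL *jpgURL = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"jpg"];
CGImageSourceRef src = CGImageSourceCreateWithURL((__bridge CFURLRef)jpgURL, NULL);
NSDictionary *props = CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(src, 0, NULL));
NSLog(@"MakerApple: %@", props[(NSString *)kCGImagePropertyMakerAppleDictionary]);
CFRelease(src);

// Video side: the QuickTime metadata item with the matching content identifier.
NSURL *movURL = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"mov"];
AVAsset *asset = [AVAsset assetWithURL:movURL];
for (AVMetadataItem *item in [asset metadataForFormat:AVMetadataFormatQuickTimeMetadata]) {
    if ([item.identifier isEqualToString:AVMetadataIdentifierQuickTimeMetadataContentIdentifier]) {
        NSLog(@"video identifier: %@", item.value);
    }
}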

The imageData snippet in step 2 above is exactly the code I'm using to convert the UIImage back to NSData with the MakerApple dictionary restored.

The handler gets called twice: first to tell you about the corrupt data, and a second time to tell you about the cancellation of the process (these are reported under two different keys in the info dictionary).
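
To tell the invocations apart at runtime, the Photos framework exposes dedicated keys in that info dictionary. A sketch of a more defensive resultHandler than the one in the question:

resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
    if ([info[PHLivePhotoInfoIsDegradedKey] boolValue]) {
        return; // interim, low-quality delivery; a later call brings the final result
    }
    NSError *error = info[PHLivePhotoInfoErrorKey];
    if (error || [info[PHLivePhotoInfoCancelledKey] boolValue]) {
        NSLog(@"live photo request failed: %@", error);
        return;
    }
    photoView.livePhoto = livePhoto; // the final, full-quality result
}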

EDIT:

Here's your mov data:


    $ ffmpeg -i cf70b7de66bd89654967aeef1d557816.mov
    Metadata:
        major_brand     : qt  
        minor_version   : 0
        compatible_brands: qt  
        creation_time   : 2018-01-27T11:07:38.000000Z
        com.apple.quicktime.content.identifier: cf70b7de66bd89654967aeef1d557816
      Duration: 00:00:15.05, start: 0.000000, bitrate: 1189 kb/s
        Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 540x960, 1051 kb/s, 29.84 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
        Metadata:
          creation_time   : 2018-01-27T11:07:38.000000Z
          handler_name    : Core Media Data Handler
          encoder         : 'avc1'
        Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
        Metadata:
          creation_time   : 2018-01-27T11:07:38.000000Z
          handler_name    : Core Media Data Handler

The com.apple.quicktime.still-image-time key is missing here.
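
That key lives in a timed metadata track, so it can't simply be patched into the container's top-level metadata. A hedged sketch, based on common community code rather than anything from this answer, of re-writing the mov with AVAssetWriter to add it (writer is an assumed AVAssetWriter that also copies the video/audio tracks):

// Declare a metadata input for the still-image-time item and wrap it in an adaptor.
NSArray *specs = @[@{
    (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier:
        @"mdta/com.apple.quicktime.still-image-time",
    (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType:
        (NSString *)kCMMetadataBaseDataType_SInt8
}];
CMFormatDescriptionRef desc = NULL;
CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
    kCFAllocatorDefault, kCMMetadataFormatType_Boxed, (__bridge CFArrayRef)specs, &desc);
AVAssetWriterInput *metadataInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeMetadata
                                       outputSettings:nil
                                     sourceFormatHint:desc];
AVAssetWriterInputMetadataAdaptor *adaptor =
    [AVAssetWriterInputMetadataAdaptor assetWriterInputMetadataAdaptorWithAssetWriterInput:metadataInput];
[writer addInput:metadataInput];
if (desc) CFRelease(desc);

// After [writer startWriting] and -startSessionAtSourceTime:, append one group
// that marks the still-image time near the start of the clip.
AVMutableMetadataItem *still = [AVMutableMetadataItem metadataItem];
still.identifier = [AVMetadataItem identifierForKey:@"com.apple.quicktime.still-image-time"
                                           keySpace:AVMetadataKeySpaceQuickTimeMetadata];
still.dataType = (NSString *)kCMMetadataBaseDataType_SInt8;
still.value = @0;
[adaptor appendTimedMetadataGroup:
    [[AVTimedMetadataGroup alloc] initWithItems:@[still]
                                      timeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMake(1, 100))]];

The matching com.apple.quicktime.content.identifier, by contrast, is untimed and can go straight into writer.metadata as a regular AVMutableMetadataItem.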

Here's how the metadata should look:


    Metadata:
        major_brand     : qt  
        minor_version   : 0
        compatible_brands: qt  
        creation_time   : 2017-12-15T12:41:00.000000Z
        com.apple.quicktime.content.identifier: 89CB44DA-D129-43F3-A0BC-2C980767B810
        com.apple.quicktime.location.ISO6709: +51.5117+007.4668+086.000/
        com.apple.quicktime.make: Apple
        com.apple.quicktime.model: iPhone X
        com.apple.quicktime.software: 11.1.2
        com.apple.quicktime.creationdate: 2017-12-15T13:41:00+0100
      Duration: 00:00:01.63, start: 0.000000, bitrate: 8902 kb/s
        Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, smpte170m/smpte432/bt709), 1440x1080, 8135 kb/s, 26.94 fps, 30 tbr, 600 tbn, 1200 tbc (default)
        Metadata:
          rotate          : 90
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler
          encoder         : H.264
        Side data:
          displaymatrix: rotation of -90.00 degrees
        Stream #0:1(und): Audio: pcm_s16le (lpcm / 0x6D63706C), 44100 Hz, mono, s16, 705 kb/s (default)
        Metadata:
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler
        Stream #0:2(und): Data: none (mebx / 0x7862656D), 12 kb/s (default)
        Metadata:
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler
        Stream #0:3(und): Data: none (mebx / 0x7862656D), 43 kb/s (default)
        Metadata:
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler

And just FYI, here's your JPEG data:


    $ magick identify -format %[EXIF:*] cf70b7de66bd89654967aeef1d557816.jpg
    exif:ColorSpace=1
    exif:ExifImageLength=960
    exif:ExifImageWidth=540
    exif:ExifOffset=26
    exif:MakerNote=65, 112, 112, 108, 101, 32, 105, 79, 83, 0, 0, 1, 77, 77, 0, 1, 0, 17, 0, 2, 0, 0, 0, 33, 0, 0, 0, 32, 0, 0, 0, 0, 99, 102, 55, 48, 98, 55, 100, 101, 54, 54, 98, 100, 56, 57, 54, 53, 52, 57, 54, 55, 97, 101, 101, 102, 49, 100, 53, 53, 55, 56, 49, 54, 0, 0

– engel94
  • thanks for your answer, now I found that I get an error complaining that the metadata of the video is wrong even when I set everything right (the `17` key of the image and the `com.apple.quicktime.content.identifier` of the video) – armnotstrong Feb 08 '18 at 04:25
  • The JPEG seems fine, since I have tested the JPEG file with another `.mov` file and they worked together – armnotstrong Feb 08 '18 at 08:44
  • Can you upload both files? – engel94 Feb 08 '18 at 08:46
  • sure, the image is [here](https://www.dropbox.com/s/3gpxduffzu9861k/cf70b7de66bd89654967aeef1d557816.jpg?dl=0) and the video is [here](https://www.dropbox.com/s/6dg1f7xjcs8iltl/cf70b7de66bd89654967aeef1d557816.mov?dl=0) – armnotstrong Feb 08 '18 at 08:54
  • I edited my answer above. How did you get the mov file to your computer? The key may have got lost while exporting – engel94 Feb 08 '18 at 11:00
  • thanks for your answer, but I think the metadata of the `.jpg` and `.mov` is just fine; you can see the `17` metadata of the jpg using [this](http://exif.regex.info/exif.cgi) online tool (note that you need to download the original image file rather than the preview from the link I provided) – armnotstrong Feb 08 '18 at 13:18
  • The JPG metadata is fine, I think it's the mov. Just noticed the mov duration is 15s – isn't that way too long for a Live Photo? Actually not sure about this. – engel94 Feb 08 '18 at 13:23
  • Hate that Apple didn't give any more information about this – armnotstrong Feb 09 '18 at 02:47