TL;DR
Here's the code to store Live Photos and upload them to a server:
1. Capturing Live Photo
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings error:(NSError *)error {
    if (error) {
        [self raiseError:error];
        return;
    }

    NSData *imageData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer previewPhotoSampleBuffer:previewPhotoSampleBuffer];
    CIImage *image = [CIImage imageWithData:imageData];

    [self.expectedAsset addInput:image.properties]; // 1. This is the metadata (which will be lost in step 2.)
    [self.expectedAsset addInput:[UIImage imageWithCIImage:image]]; // 2. Creating image, but UIImage is not designed to contain the required metadata
}
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingLivePhotoToMovieFileAtURL:(NSURL *)outputFileURL duration:(CMTime)duration photoDisplayTime:(CMTime)photoDisplayTime resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings error:(nullable NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:outputFileURL]; // 3. Store the URL to the actual video file
    }
}
expectedAsset is just an object holding all the required information; you can use an NSDictionary instead (a minimal sketch of such a holder follows after the next snippet). The first delegate method above is the pre-iOS 11 API (deprecated as of iOS 11), so here's the iOS 11+ variant:
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wunguarded-availability"
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:[photo metadata]]; // the metadata dictionary (step 1 above)
        [self.expectedAsset addInput:[UIImage imageWithData:[photo fileDataRepresentation]]]; // the image (step 2 above)
    }
}
#pragma clang diagnostic pop
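Here is the sketch of the holder mentioned above; the class name ExpectedAsset and its exact properties are assumptions of mine that simply mirror what the later snippets read back (imageMetadata, image, livePhotoURL):

#import <UIKit/UIKit.h>

@interface ExpectedAsset : NSObject
@property (nonatomic, strong) NSDictionary *imageMetadata; // from step 1
@property (nonatomic, strong) UIImage *image;              // from step 2
@property (nonatomic, strong) NSURL *livePhotoURL;         // from step 3
- (void)addInput:(id)input;
@end

@implementation ExpectedAsset
- (void)addInput:(id)input {
    // Route each captured piece to the matching property based on its type.
    if ([input isKindOfClass:[NSDictionary class]]) {
        self.imageMetadata = input;
    } else if ([input isKindOfClass:[UIImage class]]) {
        self.image = input;
    } else if ([input isKindOfClass:[NSURL class]]) {
        self.livePhotoURL = input;
    }
}
@end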
2. Generate NSData
- (NSData*)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1); // This is the UIImage (without metadata) from step 2 above
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);

    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);

    // Re-attach the Apple maker dictionary; imageMetadata is the dictionary from step 1 above
    NSMutableDictionary *maker = [NSMutableDictionary new];
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary] forKey:(NSString *)kCGImagePropertyMakerAppleDictionary];

    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)maker);
    CGImageDestinationFinalize(destination);

    // Core Foundation "Create" objects are not managed by ARC and must be released manually.
    CFRelease(destination);
    CFRelease(source);

    return dest_data;
}
- (void)dataRepresentation:(DataRepresentationLoaded)callback {
    callback(@{@"image": self.imageData, @"video": [NSData dataWithContentsOfURL:self.livePhotoURL]}); // livePhotoURL is the URL from step 3 above
}
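From here, uploading is just a matter of sending the two blobs to your backend. The following is only a rough sketch assuming a hypothetical endpoint and a base64/JSON payload; the URL, field names, and payload format are my assumptions, not part of the original code:

- (void)uploadAsset:(NSDictionary<NSString *, NSData *> *)asset {
    // NOTE: endpoint and payload format are placeholders for illustration only.
    NSURL *url = [NSURL URLWithString:@"https://example.com/api/livephoto"];
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
    request.HTTPMethod = @"POST";
    [request setValue:@"application/json" forHTTPHeaderField:@"Content-Type"];

    // Encode both parts as base64 so they survive a JSON body.
    NSDictionary *payload = @{@"image": [asset[@"image"] base64EncodedStringWithOptions:0],
                              @"video": [asset[@"video"] base64EncodedStringWithOptions:0]};
    NSData *body = [NSJSONSerialization dataWithJSONObject:payload options:0 error:nil];

    NSURLSessionUploadTask *task = [[NSURLSession sharedSession] uploadTaskWithRequest:request fromData:body completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        if (error) {
            [self raiseError:error];
        }
    }];
    [task resume];
}

Wired up to the callback above: [self dataRepresentation:^(NSDictionary *asset) { [self uploadAsset:asset]; }];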
Long Answer
This is caused by wrong metadata in the video/image file.
When creating a live photo, PHLivePhoto searches for the key 17 in the kCGImagePropertyMakerAppleDictionary of the JPEG (this is the asset identifier) and matches it against the com.apple.quicktime.content.identifier of the mov file. The mov file also needs an entry for the time at which the still image was captured (com.apple.quicktime.still-image-time).
Make sure your files haven't been edited (or exported) somewhere; even the UIImageJPEGRepresentation function will remove this data from the image.
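A quick way to check whether the identifier survived is to read the Apple maker dictionary back out of the final JPEG. This check is not part of the original pipeline, and the file URL is an assumption:

#import <ImageIO/ImageIO.h>

NSURL *jpegURL = [NSURL fileURLWithPath:@"/path/to/photo.jpg"]; // assumed path
CGImageSourceRef src = CGImageSourceCreateWithURL((__bridge CFURLRef)jpegURL, NULL);
NSDictionary *props = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(src, 0, NULL);
CFRelease(src);

// Key "17" of the Apple maker note must contain the same identifier as
// com.apple.quicktime.content.identifier in the mov file.
NSDictionary *makerApple = props[(NSString *)kCGImagePropertyMakerAppleDictionary];
NSLog(@"Asset identifier: %@", makerApple[@"17"]);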
Here's a code snippet I'm using to convert the UIImage to NSData:
- (NSData*)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);

    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);

    // Copy the Apple maker dictionary from the original metadata into the new destination.
    NSMutableDictionary *maker = [NSMutableDictionary new];
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary] forKey:(NSString *)kCGImagePropertyMakerAppleDictionary];

    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)maker);
    CGImageDestinationFinalize(destination);

    // Release the Core Foundation objects created above.
    CFRelease(destination);
    CFRelease(source);

    return dest_data;
}
The handler gets called twice: the first call reports the corrupt data, the second one reports the cancellation of the process (these are two different info keys).
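For reference, this is the call whose resultHandler fires twice; checking the info dictionary makes the two cases visible. A minimal sketch, assuming the image and video have already been written out to imageURL and videoURL:

#import <Photos/Photos.h>

[PHLivePhoto requestLivePhotoWithResourceFileURLs:@[imageURL, videoURL] // assumed file URLs
                                  placeholderImage:nil
                                        targetSize:CGSizeZero
                                       contentMode:PHImageContentModeAspectFit
                                     resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
    if (info[PHLivePhotoInfoErrorKey]) {
        // First call: the resources could not be combined into a live photo.
        NSLog(@"error: %@", info[PHLivePhotoInfoErrorKey]);
    }
    if ([info[PHLivePhotoInfoCancelledKey] boolValue]) {
        // Second call: the request was cancelled.
        NSLog(@"cancelled");
    }
}];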
EDIT:
Here's your mov data:
$ ffmpeg -i cf70b7de66bd89654967aeef1d557816.mov
Metadata:
major_brand : qt
minor_version : 0
compatible_brands: qt
creation_time : 2018-01-27T11:07:38.000000Z
com.apple.quicktime.content.identifier: cf70b7de66bd89654967aeef1d557816
Duration: 00:00:15.05, start: 0.000000, bitrate: 1189 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 540x960, 1051 kb/s, 29.84 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
Metadata:
creation_time : 2018-01-27T11:07:38.000000Z
handler_name : Core Media Data Handler
encoder : 'avc1'
Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
Metadata:
creation_time : 2018-01-27T11:07:38.000000Z
handler_name : Core Media Data Handler
The com.apple.quicktime.still-image-time key is missing here (a sketch of how this key can be written back follows after the reference output below).
Here's how the metadata should look:
Metadata:
major_brand : qt
minor_version : 0
compatible_brands: qt
creation_time : 2017-12-15T12:41:00.000000Z
com.apple.quicktime.content.identifier: 89CB44DA-D129-43F3-A0BC-2C980767B810
com.apple.quicktime.location.ISO6709: +51.5117+007.4668+086.000/
com.apple.quicktime.make: Apple
com.apple.quicktime.model: iPhone X
com.apple.quicktime.software: 11.1.2
com.apple.quicktime.creationdate: 2017-12-15T13:41:00+0100
Duration: 00:00:01.63, start: 0.000000, bitrate: 8902 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, smpte170m/smpte432/bt709), 1440x1080, 8135 kb/s, 26.94 fps, 30 tbr, 600 tbn, 1200 tbc (default)
Metadata:
rotate : 90
creation_time : 2017-12-15T12:41:00.000000Z
handler_name : Core Media Data Handler
encoder : H.264
Side data:
displaymatrix: rotation of -90.00 degrees
Stream #0:1(und): Audio: pcm_s16le (lpcm / 0x6D63706C), 44100 Hz, mono, s16, 705 kb/s (default)
Metadata:
creation_time : 2017-12-15T12:41:00.000000Z
handler_name : Core Media Data Handler
Stream #0:2(und): Data: none (mebx / 0x7862656D), 12 kb/s (default)
Metadata:
creation_time : 2017-12-15T12:41:00.000000Z
handler_name : Core Media Data Handler
Stream #0:3(und): Data: none (mebx / 0x7862656D), 43 kb/s (default)
Metadata:
creation_time : 2017-12-15T12:41:00.000000Z
handler_name : Core Media Data Handler
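If the mov has been re-encoded somewhere along the way, those QuickTime keys have to be written back while re-muxing the file. This is not part of the pipeline above; it's a rough sketch of the two pieces an AVAssetWriter-based rewrite would need, where assetWriter, assetIdentifier, and photoDisplayTime are assumed to exist, and the data type strings are my assumptions about the usual values:

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// 1. com.apple.quicktime.content.identifier as top-level metadata on the writer.
AVMutableMetadataItem *contentIdentifier = [AVMutableMetadataItem metadataItem];
contentIdentifier.key = @"com.apple.quicktime.content.identifier";
contentIdentifier.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
contentIdentifier.value = assetIdentifier; // must equal key 17 of the JPEG's maker dictionary
contentIdentifier.dataType = @"com.apple.metadata.datatype.UTF-8";
assetWriter.metadata = @[contentIdentifier];

// 2. com.apple.quicktime.still-image-time as a timed metadata track
//    (this becomes one of the 'mebx' data streams seen in the reference output).
NSDictionary *spec = @{(NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier:
                           @"mdta/com.apple.quicktime.still-image-time",
                       (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType:
                           @"com.apple.metadata.datatype.int8"};
CMFormatDescriptionRef desc = NULL;
CMMetadataFormatDescriptionCreateWithMetadataSpecifications(kCFAllocatorDefault, kCMMetadataFormatType_Boxed, (__bridge CFArrayRef)@[spec], &desc);

AVAssetWriterInput *metadataInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeMetadata outputSettings:nil sourceFormatHint:desc];
AVAssetWriterInputMetadataAdaptor *adaptor = [AVAssetWriterInputMetadataAdaptor assetWriterInputMetadataAdaptorWithAssetWriterInput:metadataInput];
[assetWriter addInput:metadataInput];

// After [assetWriter startSessionAtSourceTime:...], append the still image time
// at the moment the still frame was captured.
AVMutableMetadataItem *stillImageTime = [AVMutableMetadataItem metadataItem];
stillImageTime.key = @"com.apple.quicktime.still-image-time";
stillImageTime.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
stillImageTime.value = @(0);
stillImageTime.dataType = @"com.apple.metadata.datatype.int8";
[adaptor appendTimedMetadataGroup:[[AVTimedMetadataGroup alloc] initWithItems:@[stillImageTime] timeRange:CMTimeRangeMake(photoDisplayTime, CMTimeMake(1, 600))]];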
And just FYI, here's your JPEG data:
$ magick identify -format %[EXIF:*] cf70b7de66bd89654967aeef1d557816.jpg
exif:ColorSpace=1
exif:ExifImageLength=960
exif:ExifImageWidth=540
exif:ExifOffset=26
exif:MakerNote=65, 112, 112, 108, 101, 32, 105, 79, 83, 0, 0, 1, 77, 77, 0, 1, 0, 17, 0, 2, 0, 0, 0, 33, 0, 0, 0, 32, 0, 0, 0, 0, 99, 102, 55, 48, 98, 55, 100, 101, 54, 54, 98, 100, 56, 57, 54, 53, 52, 57, 54, 55, 97, 101, 101, 102, 49, 100, 53, 53, 55, 56, 49, 54, 0, 0