
I am desperately fighting against this issue...

Background: I have an app that takes 4 images with different exposures in one shutter press, configured via AVCapturePhotoBracketSettings. I've converted each CMSampleBufferRef to a UIImage* and saved them to my photo album by calling

UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);// image is UIImage*

But later, when I exported the images to my Mac, I realized the exposure parameters were all gone (checked by right-clicking an image -> Get Info). I have an NSArray that keeps the 4 exposures. I want to attach them to the UIImages, one each, so that after saving them to my album the exposure time shows up in Get Info, exactly like an image taken with the iPhone's built-in camera does.


Problem: I have three arrays of length 4. One stores the 4 exposure values as floats; the second has the 4 CMSampleBufferRef image buffers; the last has the 4 UIImage* converted from the second for display on screen (see the sketch below). All I need is either to save the UIImages to my album as JPEGs together with the exposure values, or to save directly from the CMSampleBufferRefs with the exposure values to my album as JPEGs (I don't think the latter is possible, though).
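For reference, roughly how these are kept (a sketch; the names are illustrative, and the sample buffers sit in a CFArray because CMSampleBufferRef is a CF type, which is also what the fetch code further down assumes):

// Rough sketch of the three containers described above (names are illustrative).
NSMutableArray *exposureTimes = [NSMutableArray arrayWithCapacity:4];     // 4 exposure durations as NSNumbers
CFMutableArrayRef imageSampleBufferArray =
    CFArrayCreateMutable(kCFAllocatorDefault, 4, &kCFTypeArrayCallBacks); // 4 CMSampleBufferRefs
NSMutableArray *displayImages = [NSMutableArray arrayWithCapacity:4];     // 4 UIImage* for on-screen display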


My findings: from this post and this official doc, I learned that a UIImage does not actually contain metadata.
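To illustrate, here is the quick sketch I would use to check what survives the UIImage round trip (someUIImage is a placeholder for any of the 4 converted images):

// Sketch: dump what metadata survives the UIImage -> JPEG round trip (needs <ImageIO/ImageIO.h>).
NSData *jpegData = UIImageJPEGRepresentation(someUIImage, 1.0f);
CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)jpegData, NULL);
NSDictionary *props = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(src, 0, NULL);
NSLog(@"EXIF after round trip: %@", props[(__bridge NSString *)kCGImagePropertyExifDictionary]); // no capture EXIF left
CFRelease(src);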

After a lot of research, I ended up with two leads that seemed to be on the right path.

One is using the Photos framework to set the exposure on the image's asset and add it to the album. This and this, and lots of other posts, helped with the coding. However, I couldn't find any property where I could set the exposure. The "Reading Asset Metadata" properties on PHAsset seem to be all that's available, and none of them is the exposure time.
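As far as I can tell, the only asset fields that can be written go through PHAssetChangeRequest, and none of them is an exposure (a sketch; 'asset' would be an already-fetched PHAsset):

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *req = [PHAssetChangeRequest changeRequestForAsset:asset];
    req.creationDate = [NSDate date];
    req.favorite = YES;
    // req.location and req.hidden are the only other writable properties; no exposure time anywhere.
} completionHandler:nil];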

The other one is Image I/O -> CGImageProperties -> EXIF Dictionary Keys -> kCGImagePropertyExifExposureTime. The CGImageDestinationSetProperties function can set it, but that writes the image data to a destination inside the app instead of saving the image to the album... So I finally found an exposure time property, but I can't use it...
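For completeness, this is roughly how that key can be written with Image I/O, but it only lands in a local file, not in the album (a sketch; someCGImage and someFileURL are placeholders, and kUTTypeJPEG needs <MobileCoreServices/MobileCoreServices.h>):

NSDictionary *props = @{ (__bridge NSString *)kCGImagePropertyExifDictionary :
                             @{ (__bridge NSString *)kCGImagePropertyExifExposureTime : @(1.0/60.0) } };
CGImageDestinationRef dest = CGImageDestinationCreateWithURL((__bridge CFURLRef)someFileURL, kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(dest, someCGImage, (__bridge CFDictionaryRef)props); // attach the EXIF while adding the image
CGImageDestinationFinalize(dest);
CFRelease(dest);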


Did I miss something? This seems like it should be very easy to do; my colleague with an electrical engineering background looked at me as if I were an idiot for spending a whole day on it without figuring it out...


Update:

It seems like the creationRequestForAssetFromImageAtFileURL: method will be able to store an image with EXIF in the album. This post (Metadata lost when saving photo using PHPhotoLibrary) indicates that it is better to temporarily save the image data with the EXIF info to a file, then call the method above on that file.

I took the code from this post (Saving photo with metadata into Camera Roll with specific file name) and modified it.

Create album:

PHFetchResult *normalAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:nil];

NSLog(@"There are %lu albums fetched.", (unsigned long)normalAlbums.count);//output 0 I don't know why

_album = normalAlbums.firstObject; // this gives me nil, but it can still save the image to the album; is this the cause of the problem?
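Side note: I suspect the fetch comes back empty because I never created a user album. A sketch of creating one first so _album is not nil (the title is just an example):

// Sketch: create the album once if the fetch is empty, then re-fetch it by its local identifier.
__block NSString *albumLocalId = nil;
[[PHPhotoLibrary sharedPhotoLibrary] performChangesAndWait:^{
    PHAssetCollectionChangeRequest *createAlbum =
        [PHAssetCollectionChangeRequest creationRequestForAssetCollectionWithTitle:@"Brackets"];
    albumLocalId = createAlbum.placeholderForCreatedAssetCollection.localIdentifier;
} error:nil];
_album = [PHAssetCollection fetchAssetCollectionsWithLocalIdentifiers:@[albumLocalId] options:nil].firstObject;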

Fetch EXIF info:

// imageSampleBufferArray is a CFArrayRef holding the CMSampleBufferRefs with the image data
CMSampleBufferRef sampleBuf = (CMSampleBufferRef)CFArrayGetValueAtIndex(imageSampleBufferArray, i);
NSDictionary* exifAttachments = (__bridge NSDictionary*)CMGetAttachment(sampleBuf, (__bridge CFStringRef)@"{Exif}", NULL);
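For reference, a sketch of how my known exposure value could be merged into that dictionary before writing it out (exposureTimes is the array from the Problem section; nesting under kCGImagePropertyExifDictionary is my assumption about the shape CGImageDestination expects):

// Sketch: merge the known exposure duration for frame i into the fetched EXIF dictionary.
NSMutableDictionary *exif = [exifAttachments mutableCopy] ?: [NSMutableDictionary dictionary];
exif[(__bridge NSString *)kCGImagePropertyExifExposureTime] = exposureTimes[i];
NSDictionary *metadataForFrame = @{ (__bridge NSString *)kCGImagePropertyExifDictionary : exif };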

Save image:

// Needs <Photos/Photos.h>, <ImageIO/ImageIO.h>, and <MobileCoreServices/MobileCoreServices.h> (for kUTTypeJPEG).
+(void)saveImageWithMetaToCameraRoll:(UIImage*)image album:(PHAssetCollection *)album exif:(NSDictionary *)metadata{

    NSMutableDictionary *meta = [metadata mutableCopy];
    // Re-encode the UIImage as JPEG and open it as an image source so the metadata can be attached on the way out.
    NSData *imageData = UIImageJPEGRepresentation(image, 1.0f);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef) imageData, NULL);

    NSString *timestamp = [NSString stringWithFormat:@"%.0f", [[NSDate date] timeIntervalSince1970] * 1000];
    NSString *fileName = [NSString stringWithFormat:@"IMG_%@.jpeg", timestamp];

    NSURL *tmpURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:fileName]];

    // Write the image plus the metadata dictionary to a temporary JPEG file.
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef) tmpURL, kUTTypeJPEG, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef) meta);

    CGImageDestinationFinalize(destination);
    CFRelease(source);
    CFRelease(destination);

    // Create the asset from the temporary file so the metadata written above travels with it.
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetChangeRequest *newAssetRequest = [PHAssetChangeRequest creationRequestForAssetFromImageAtFileURL:tmpURL];
        PHObjectPlaceholder *placeholderAsset = newAssetRequest.placeholderForCreatedAsset;

        if (album) {
            PHAssetCollectionChangeRequest *changeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:album];

            [changeRequest addAssets:@[placeholderAsset]];
        }

    } completionHandler:^(BOOL success, NSError *error) {
        //Clean up the file:
        [[NSFileManager defaultManager] removeItemAtURL:tmpURL error:nil];

        if (error) {
            NSLog(@"%@", error);
        }

    }];
}
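Called roughly like this for each of the 4 frames (a sketch; the class name and displayImages are illustrative):

for (int i = 0; i < 4; i++) {
    CMSampleBufferRef buf = (CMSampleBufferRef)CFArrayGetValueAtIndex(imageSampleBufferArray, i);
    NSDictionary *exif = (__bridge NSDictionary *)CMGetAttachment(buf, kCGImagePropertyExifDictionary, NULL);
    [MyCaptureHelper saveImageWithMetaToCameraRoll:displayImages[i] album:_album exif:exif];
}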

However, when I look at the image I've just taken in the album, it still doesn't have any EXIF information like the exposure time. Still investigating...
