
My company is having a big problem getting correct size metadata when fetching PHAssets. We have developed an iOS application that lets customers choose pictures from their library, gets the size (in pixels) of each picture, calculates coordinates for fitting the picture to the gadgets we sell, then uploads a high-quality version of the picture to our server to print the gadgets. For some of our customers, the problem is that the pixel size of some of the uploaded high-quality versions does not match the pixelWidth and pixelHeight reported by the PHAsset object. As an example, we have a picture that:

  • is reported as 2096x3724 by the PHAsset object
  • but, when the full-size image is requested, a 1536x2730 picture is generated

The picture is not in iCloud, and was sent by an iPhone SE running iOS 10.2. This is the code we use to get the full-size image version:

PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;  

PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];

[imageManager requestImageForAsset:imageAsset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeDefault options:imgOpts resultHandler:^(UIImage *result, NSDictionary *info) {
    NSData *imageData = UIImageJPEGRepresentation(result, 0.92f);
    // UPLOAD OF imageData TO SERVER HERE
}];

We also tried the requestImageDataForAsset: method, but with no luck:

PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;  

PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];

[imageManager requestImageDataForAsset:imageAsset options:imgOpts resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // UPLOAD OF imageData TO SERVER HERE
}];

Getting the exact size from the high-resolution version of every picture before uploading is not an option for us, because it would significantly degrade performance when a large number of assets is selected from the library.

Are we missing something or doing something wrong? Is there a way to get the asset size in pixels without loading the full-resolution image into memory? Thanks for helping.

Redarea
  • Any updates on this? I am having the exact same issue. – Karthik May 17 '17 at 16:39
  • Actually, no updates. After many tests, we ended up sending to our server the supposed size (PHAsset pixelWidth and pixelHeight), opening the picture once it is uploaded and checking the real size again, then adjusting the coordinates based on the real size (a sketch of this adjustment follows these comments). – Redarea May 19 '17 at 15:38
  • We also got no answer from anyone on the Apple developer forums. – Redarea May 19 '17 at 15:40
  • I ended up doing something similar. Rather than doing it on the backend as in your case, I had to do it on the app side, i.e. before uploading the image, fetch the maximum size, check the image size, and adjust if needed before uploading. – Karthik May 25 '17 at 16:43
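
A minimal sketch of the adjustment both workarounds describe (all names here are illustrative, not taken from the comments): once the real pixel size of the produced image is known, coordinates that were computed against the PHAsset-reported size are rescaled by the width and height ratios.

// Illustrative sketch only: rescale a crop rectangle that was computed against
// the size reported by PHAsset (assumedSize) so it matches the real pixel size
// of the image that was actually produced (actualSize).
CGSize assumedSize = CGSizeMake(asset.pixelWidth, asset.pixelHeight); // e.g. 2096x3724
CGSize actualSize  = realImageSize;                                   // e.g. 1536x2730, measured afterwards
CGFloat scaleX = actualSize.width  / assumedSize.width;
CGFloat scaleY = actualSize.height / assumedSize.height;
CGRect adjustedCrop = CGRectMake(cropRect.origin.x    * scaleX,
                                 cropRect.origin.y    * scaleY,
                                 cropRect.size.width  * scaleX,
                                 cropRect.size.height * scaleY);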

2 Answers


This is due to a bug in the Photos framework. Details about the bug can be found here.

Sometimes, after a photo is edited, a smaller version is created. This only occurs for some larger photos.

Calling either requestImageForAsset: (with PHImageManagerMaximumSize) or requestImageDataForAsset: (with PHImageRequestOptionsDeliveryModeHighQualityFormat) will read the data from the smaller file version when trying to retrieve the edited version (PHImageRequestOptionsVersionCurrent).

The info dictionary in the callback of the above methods contains the path to the image file. As an example:
PHImageFileURLKey = "file:///[...]DCIM/100APPLE/IMG_0006/Adjustments/IMG_0006.JPG";
Inspecting that folder, I was able to find another image, FullSizeRender.jpg. This one has the full size and contains the latest edits. Thus, one way would be to try to read FullSizeRender.jpg when such a file is present.
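
A rough sketch of that check, done inside the requestImageForAsset: result handler and relying on the undocumented PHImageFileURLKey mentioned above; whether a sandboxed app is actually allowed to read this path may depend on the iOS version, so treat this as an assumption to verify:

// Rough sketch: look for FullSizeRender.jpg next to the smaller adjusted image.
NSURL *fileURL = info[@"PHImageFileURLKey"];   // undocumented key, may be absent
NSURL *fullSizeURL = [[fileURL URLByDeletingLastPathComponent]
                      URLByAppendingPathComponent:@"FullSizeRender.jpg"];
if ([[NSFileManager defaultManager] fileExistsAtPath:fullSizeURL.path]) {
    // the full-size edited image exists; read it instead of the smaller copy
    NSData *fullSizeData = [NSData dataWithContentsOfURL:fullSizeURL];
    // upload fullSizeData instead of the re-encoded result
}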


Starting with iOS 9, it is also possible to fetch the latest edit, at the highest resolution, using PHAssetResourceManager:
// if (@available(iOS 9.0, *)) {
// check if a high quality edit is available
NSArray<PHAssetResource *> *resources = [PHAssetResource assetResourcesForAsset:_asset];
PHAssetResource *hqResource = nil;
for (PHAssetResource *res in resources) {
    if (res.type == PHAssetResourceTypeFullSizePhoto) {
        // from my tests so far, this is only present for edited photos
        hqResource = res;
        break;
    }
}

if (hqResource) {
    PHAssetResourceRequestOptions *options = [[PHAssetResourceRequestOptions alloc] init];
    options.networkAccessAllowed = YES;
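    // note: fileSize is not a documented PHAssetResource property; it is read below via KVC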
    long long fileSize = [[hqResource valueForKey:@"fileSize"] longLongValue];
    NSMutableData *fullData = [[NSMutableData alloc] initWithCapacity:fileSize];

    [[PHAssetResourceManager defaultManager] requestDataForAssetResource:hqResource options:options dataReceivedHandler:^(NSData * _Nonnull data) {
        // append the data that we're receiving
        [fullData appendData:data];
    } completionHandler:^(NSError * _Nullable error) {
        // handle completion, using `fullData` or `error`
        // uti == hqResource.uniformTypeIdentifier
        // orientation == UIImageOrientationUp
    }];
}
else {
    // use `requestImageDataForAsset:`, `requestImageForAsset:` or `requestDataForAssetResource:` with a different `PHAssetResource`
}
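
Whichever request is used, the actual pixel size of the bytes about to be uploaded can be checked cheaply with ImageIO, without decoding the whole bitmap. A minimal sketch (the helper below is my own, not part of the code above), assuming the encoded image data is already in memory:

#import <ImageIO/ImageIO.h>

// Reads the pixel dimensions from encoded image data without decoding the full bitmap.
static CGSize pixelSizeOfImageData(NSData *imageData) {
    CGSize size = CGSizeZero;
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    if (source) {
        NSDictionary *options = @{ (NSString *)kCGImageSourceShouldCache : @NO };
        NSDictionary *properties = CFBridgingRelease(
            CGImageSourceCopyPropertiesAtIndex(source, 0, (__bridge CFDictionaryRef)options));
        size = CGSizeMake([properties[(NSString *)kCGImagePropertyPixelWidth] doubleValue],
                          [properties[(NSString *)kCGImagePropertyPixelHeight] doubleValue]);
        CFRelease(source);
    }
    return size;
}

Comparing the result against asset.pixelWidth and asset.pixelHeight before uploading reveals whether the smaller edited copy was returned, without paying the cost of decoding the full-resolution image.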
alex-i

Can you try this to fetch Camera Roll pics:

__weak __typeof(self) weakSelf = self;
PHFetchResult<PHAssetCollection *> *albums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumSelfPortraits options:nil];
[albums enumerateObjectsUsingBlock:^(PHAssetCollection * _Nonnull album, NSUInteger idx, BOOL * _Nonnull stop) {
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.wantsIncrementalChangeDetails = YES;
    options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %d",PHAssetMediaTypeImage];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
    PHFetchResult<PHAsset *> *assets = [PHAsset fetchAssetsInAssetCollection:album options:options];
    if(assets.count>0)
    {
        [assets enumerateObjectsUsingBlock:^(PHAsset * _Nonnull asset, NSUInteger idx, BOOL * _Nonnull stop) {
            if(asset!=nil)
            {
                [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeAspectFill options:nil resultHandler:^(UIImage *result, NSDictionary *info)
                 {
                     dispatch_async(dispatch_get_main_queue(), ^{
                         [weakSelf addlocalNotificationForFilters:result];
                         // [weakSelf.buttonGalery setImage:result forState:UIControlStateNormal];
                     });
                 }];
                *stop = YES;
            }
            else{
                [weakSelf getlatestAferSelfie];
            }
        }];
    }
}];
Ajjjjjjjj
  • Calling requestImageForAsset: with PHImageManagerMaximumSize as the targetSize implies loading the full-resolution image data, which is what we would like to avoid, because our customers could select as many as 200 pictures at once, and that could mean waiting a long time for processing to finish. We'd like a way of getting the real-size metadata without actually opening the high-resolution version of the picture. – Redarea Apr 24 '17 at 14:21