
I am uploading photos to a server with an iOS app. It is important that the photos are uploaded with no loss in quality and are uploaded as JPEGs. My current problem is that the photos upload with no loss of quality but have a larger than expected file size. For example: I uploaded a file through the app and the file size was 4.7 MB. When I emailed the same photo to myself and selected the "Actual Photo" option for the email, the size of the photo was only 1.7 MB. A side-by-side comparison revealed no difference in quality.

Here is how I am uploading the files.

ALAssetsLibrary *library = [ALAssetsLibrary new];

// (getImageAtURL:withCompletionBlock: is presumably a custom helper that loads a UIImage for the asset URL)
[library getImageAtURL:orderImage.imageUrl withCompletionBlock:^(UIImage *image) {

    NSData *fileData = UIImageJPEGRepresentation(image, 1.0);

    NSURLRequest *request = [self multipartFormRequestWithMethod:@"POST" path:path parameters:nil constructingBodyWithBlock:^(id<AFMultipartFormData> formData)
    {
        [formData appendPartWithFileData:fileData name:@"uploadedfile" fileName:fileName mimeType:mimeType];
        [formData appendPartWithFormData:[extraInfo dataUsingEncoding:NSISOLatin2StringEncoding] name:@"extraInfo"];
    }];
}];
Michael Smith
  • By the way, in addition to noticing that the file is larger, you'll find that it is, ironically, generally stripped of metadata (e.g. the camera details, geo data if any, etc.). – Rob Dec 30 '14 at 16:57

1 Answer


The problem is UIImageJPEGRepresentation. It does not retrieve the original JPEG, but rather creates a new JPEG. And when you use a compressionQuality of 1 (presumably to avoid further image quality loss), it creates this new representation with minimal compression, generally resulting in a file larger than the original.
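
For example, you can see the inflation directly by comparing the size reported by the original asset's representation with the size of the re-encoded data. A minimal sketch, assuming assetsLibraryURL is the library URL of the photo in question:

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetsLibraryURL resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];

    // size of the original JPEG as it exists in the library
    long long originalSize = [representation size];

    // re-encode through UIImage the way the question's code does
    UIImage *image = [UIImage imageWithCGImage:[representation fullResolutionImage]];
    NSData *reencoded = UIImageJPEGRepresentation(image, 1.0);

    NSLog(@"original: %lld bytes, re-encoded: %lu bytes", originalSize, (unsigned long)reencoded.length);
} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];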

I would advise using getBytes to retrieve the original asset, rather than round-tripping it through a UIImage and getting the data via UIImageJPEGRepresentation:

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetsLibraryURL resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];

    // I generally would write directly to a `NSOutputStream`, but if you want it in a
    // NSData, it would be something like:

    NSMutableData *data = [NSMutableData data];

    // now loop, reading data into buffer and writing that to our data stream

    NSError *error;
    long long bufferOffset = 0ll;
    NSInteger bufferSize = 10000;
    long long bytesRemaining = [representation size];
    uint8_t buffer[bufferSize];
    while (bytesRemaining > 0) {
        NSUInteger bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:bufferSize error:&error];
        if (bytesRead == 0) {
            NSLog(@"error reading asset representation: %@", error);
            return;
        }
        bytesRemaining -= bytesRead;
        bufferOffset   += bytesRead;
        [data appendBytes:buffer length:bytesRead];
    }

    // ok, successfully read original asset; 
    // do whatever you want with it here
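
    // for example, the original bytes can be posted with the same AFNetworking multipart
    // request as in the question (a sketch; `path`, `fileName`, `mimeType`, and `extraInfo`
    // are assumed to be the same values used in the question's code)
    NSURLRequest *request = [self multipartFormRequestWithMethod:@"POST" path:path parameters:nil constructingBodyWithBlock:^(id<AFMultipartFormData> formData)
    {
        [formData appendPartWithFileData:data name:@"uploadedfile" fileName:fileName mimeType:mimeType];
        [formData appendPartWithFormData:[extraInfo dataUsingEncoding:NSISOLatin2StringEncoding] name:@"extraInfo"];
    }];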

} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];

--

If you're using the Photos framework introduced in iOS 8, you can use PHImageManager to get the image data:

PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[assetsLibraryURL] options:nil];
PHAsset *asset = [result firstObject];
if (asset) {
    PHImageManager *manager = [PHImageManager defaultManager];
    [manager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        // use `imageData` here
    }];
}
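
If the photo only exists in iCloud (e.g. the device is using optimized storage), the image data handed to the result handler can be nil unless you allow network access. A minimal sketch of passing a PHImageRequestOptions for that, assuming the same manager and asset as above:

PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.networkAccessAllowed = YES;   // allow downloading the original from iCloud if it isn't on the device

[manager requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // use `imageData` here
}];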
Rob
  • Would that be better than using `UIImagePNGRepresentation` as well? – gabriel_vincent Aug 04 '15 at 15:33
  • It depends upon the image, but generally, yes, it's much better to refer to the original image as shown above. For example, I have a 2448x3264 photo taken on my iPhone for which the original asset was 1.5 MB, the JPEG at 100% quality was 5 MB, and the PNG was 13.3 MB. That's still better than 32 MB uncompressed, and the PNG is lossless, but that's a huge price to pay, especially when the full-quality image is so easily retrieved. – Rob Aug 04 '15 at 16:41
  • Hi @Rob... I am using a different API in Swift to get the image ... func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) { imageToUpload = info[UIImagePickerControllerOriginalImage] as? UIImage } .....Here the size of the original image is bigger than the image in the Photo gallery... Any explanation on why this is happening? – pritam001 Aug 16 '17 at 09:17
  • This is happening because you're not retrieving the asset itself, but rather are asking it to retrieve the asset converted to a `UIImage`, which you then are presumably converting back to JPG or PNG representation. The key is to get the `UIImagePickerControllerPHAsset` in iOS 11 (`UIImagePickerControllerReferenceURL` in prior iOS versions), and use that to retrieve the original asset from the `PHImageManager`. See https://stackoverflow.com/a/32938728/1271826 – Rob Aug 16 '17 at 09:25
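
A minimal sketch of what that last comment describes, assuming iOS 11's UIImagePickerControllerPHAsset key (on earlier versions you would fetch the PHAsset via UIImagePickerControllerReferenceURL instead, as in the linked answer):

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    PHAsset *asset = info[UIImagePickerControllerPHAsset];   // iOS 11+; present when the user has granted photo library access
    if (asset) {
        [[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *imageInfo) {
            // `imageData` is the original asset data, not a re-encoded UIImage
        }];
    }
    [picker dismissViewControllerAnimated:YES completion:nil];
}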