
Scenario: I have an image in the iPhone camera roll. I access it using ALAssetLibrary and get an ALAsset object. I get a UIImage and NSData object from it using something like the following code.

ALAssetRepresentation *rep = [myasset defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
if (iref)
{
    UIImage *largeimage = [UIImage imageWithCGImage:iref];
    NSData *data = UIImageJPEGRepresentation(largeimage, 1.0f);
}

I then copy the image from the camera roll onto my Mac using Image Capture. I then use NSImage in my Mac code to open the copied image and try to get an NSData representation using the following code.

NSImage * image = [[NSImage alloc] initWithContentsOfURL:fileURL];
NSBitmapImageRep *imgRep = [[image representations] objectAtIndex: 0];
NSData *data = [imgRep representationUsingType:NSJPEGFileType properties:nil];

Problem: Unfortunately, the two NSData representations I get are very different. I want to be able to get the same NSData representation in both cases (since it is the same file). I can then go on to hash the NSData objects and compare the hashes to conclude that the two are (possibly) the same image. Ideally I would want the following two functions:

//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset;
//or 
-(NSData *) getDataFromUIImage:(UIImage*)image;

//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url;
//or 
-(NSData *) getDataFromNSImage:(NSImage*)image;

Such that the NSData* representation I get in OS X and iOS are exactly the same given that they come from the same source image.

What I have tried:

I have tried playing around with how I get the UIImage object from the ALAsset object. I have tried UIImagePNGRepresentation (and the corresponding call for getting NSData in OS X). I have also tried different parameters for getting the representation in OS X, but nothing has worked. I have also tried to create a CGImageRef on both platforms, convert those to bitmaps and read them pixel by pixel, and even those seem to be off (and yes, I do realise that NSBitmapImageRep has a flipped coordinate system).

    You are performing JPEG compression on the image multiple times, so the bits are inevitably going to change. – NSAdam Dec 13 '13 at 06:53
  • @NSAdam can you suggest a way to avoid this? Any insights into solving the actual problem? – Tayyab Dec 13 '13 at 07:55
  • I have also tried to get a NSData object using CGImageRef but that does not work either. – Tayyab Dec 13 '13 at 08:03
  • What are you really trying to do? – NSAdam Dec 13 '13 at 08:05
  • I want to get some sort of a hash based on the content of the image to detect exact duplicates. Using a sha512 hash of the UIImageJPEGRepresentation works across multiple iOS devices (i.e. duplicates are detected through hashes) but fails on a different platform such as the OS X. – Tayyab Dec 13 '13 at 08:09
  • 1
    Does iOS allow you to access the exact image file as it is stored in the photo library? If the first thing it gives you is already an image representation of some sort, then you may be out of luck as who knows what specific processing it may have done by that point. The reason you can compare across iOS devices is that whatever processing it does is the same on both devices. The image on OS X, however, has no such processing applied. – NSAdam Dec 13 '13 at 08:46
  • iOS does not give me reference to the exact file. I can however get a CGImageRef `- (CGImageRef)CGImageWithOptions:(NSDictionary *)options` which according to the docs is **a convenient way to obtain a CGImage representation of an asset. This method returns the biggest, best representation available, unadjusted in any way.** – Tayyab Dec 13 '13 at 08:55

1 Answer


I did eventually find a way to do what I wanted. The ALAssetRepresentation class's getBytes:fromOffset:length:error: method can be used to get an NSData object that is byte-for-byte identical to what [NSData dataWithContentsOfURL:fileURL] returns in OS X. Note that going through a UIImage is not possible, since UIImage performs some processing on the image. Here is what the requested functions would look like.

//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    Byte *buffer = (Byte *)malloc((size_t)rep.size);
    if (buffer == NULL) return nil;
    NSError *error = nil;
    NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:&error];
    if (buffered == 0) {
        free(buffer);
        return nil;
    }
    // NSData takes ownership of the buffer and frees it when deallocated.
    NSData *assetData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
    return assetData;
}

//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url
{
    return [NSData dataWithContentsOfURL:url];
}
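
Once you have byte-identical NSData on both platforms, hashing it is straightforward. Here is a minimal sketch of the comparison step using CommonCrypto's CC_SHA512 (available on both iOS and OS X); the helper name sha512HexForData is my own, not from the question.

#import <CommonCrypto/CommonDigest.h>

// Hashes raw file bytes; the same bytes yield the same hex digest on both platforms.
static NSString *sha512HexForData(NSData *data) {
    unsigned char digest[CC_SHA512_DIGEST_LENGTH];
    CC_SHA512(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_SHA512_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA512_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return [hex copy];
}

Two images are then duplicates if [sha512HexForData(dataA) isEqualToString:sha512HexForData(dataB)].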