
Possible Duplicate:
How Do I Get The Correct Latitude and Longitude From An Uploaded iPhone Photo?

I need to access the geolocation (latitude & longitude) information from photos taken with the iPhone camera and from photos chosen from the Camera Roll. First things first, photos taken with the camera...

I'm using the following code once the image is taken with the camera. I then pull the image metadata out and store it in an NSMutableDictionary. However, I do NOT see any GPS information in the dictionary. Where is it, and how do I access it? I can't find good documentation on this either.

Code below:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{    
    // snip out code above to keep this question short.

    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if([mediaType isEqualToString:(__bridge NSString *)kUTTypeImage])
    {
        // images taken with camera have direct access to the metadata by using UIImagePickerControllerMediaMetadata
        NSMutableDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:[info objectForKey:UIImagePickerControllerMediaMetadata]];
        NSLog(@"dictionary: %@", metadata);

        // rest of code snipped out to keep this question short

NSLog outputs the following info - but NO GPS info

dictionary: {
DPIHeight = 72;
DPIWidth = 72;
Orientation = 6;
"{Exif}" =     {
    ApertureValue = "2.970853654340484";
    BrightnessValue = "-1.198278557694757";
    ColorSpace = 1;
    DateTimeDigitized = "2012:04:11 21:55:29";
    DateTimeOriginal = "2012:04:11 21:55:29";
    ExposureMode = 0;
    ExposureProgram = 2;
    ExposureTime = "0.06666666666666667";
    FNumber = "2.8";
    Flash = 24;
    FocalLength = "3.85";
    ISOSpeedRatings =         (
        1000
    );
    MeteringMode = 5;
    PixelXDimension = 2592;
    PixelYDimension = 1936;
    SceneType = 1;
    SensingMethod = 2;
    Sharpness = 1;
    ShutterSpeedValue = "3.911199862602335";
    SubjectArea =         (
        1295,
        967,
        699,
        696
    );
    WhiteBalance = 0;
};
"{TIFF}" =     {
    DateTime = "2012:04:11 21:55:29";
    Make = Apple;
    Model = "iPhone 4";
    Software = "5.0.1";
    XResolution = 72;
    YResolution = 72;
};
}
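
If the GPS data were there, it would show up as a "{GPS}" sub-dictionary in that metadata (the ImageIO constant kCGImagePropertyGPSDictionary). This is roughly how I'd expect to read it - just a sketch for illustration, since with the output above it obviously finds nothing:

// Requires ImageIO: #import <ImageIO/ImageIO.h>
NSDictionary *gps = [metadata objectForKey:(__bridge NSString *)kCGImagePropertyGPSDictionary];
if (gps)
{
    // Latitude/longitude are stored as positive numbers; the Ref keys ("N"/"S", "E"/"W") give the sign
    double lat = [[gps objectForKey:(__bridge NSString *)kCGImagePropertyGPSLatitude] doubleValue];
    double lon = [[gps objectForKey:(__bridge NSString *)kCGImagePropertyGPSLongitude] doubleValue];
    if ([[gps objectForKey:(__bridge NSString *)kCGImagePropertyGPSLatitudeRef] isEqualToString:@"S"]) lat = -lat;
    if ([[gps objectForKey:(__bridge NSString *)kCGImagePropertyGPSLongitudeRef] isEqualToString:@"W"]) lon = -lon;
    NSLog(@"latitude: %f, longitude: %f", lat, lon);
}
else
{
    NSLog(@"no {GPS} dictionary in the metadata");
}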

I've also tried another way, using ALAssetsLibrary; however, I never enter my if statement, only the else portion - which obviously produces the same information as above. The outcome is useless, as I still don't have any GPS data.

Code for that is below:

Create an @property of type NSMutableDictionary called photoMetadataDictionary_.
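
Something like this (the exact property attributes are up to you; the important bit is that it gets initialized somewhere - e.g. in viewDidLoad - before the picker delegate runs, otherwise the addEntriesFromDictionary: calls below are silent no-ops on nil):

// .h (or class extension):
@property (nonatomic, strong) NSMutableDictionary *photoMetadataDictionary_;

// .m - somewhere before the picker delegate fires, e.g. viewDidLoad:
self.photoMetadataDictionary_ = [NSMutableDictionary dictionary];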

Then, in the .m file, do the following:

if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 4.1f)
{
    NSURL *assetURL = nil;
    if ((assetURL = [info objectForKey:UIImagePickerControllerReferenceURL]))
    {
        // I never get here... Not sure why...
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:assetURL
                 resultBlock:^(ALAsset *asset)
                 {
                     NSDictionary *metadata = asset.defaultRepresentation.metadata;
                     [photoMetadataDictionary_ addEntriesFromDictionary:metadata];
                 }
                failureBlock:^(NSError *error)
                 {
                 }];
    }
    else
    {
        NSDictionary *metadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
        if (metadata)
        {
            [photoMetadataDictionary_ addEntriesFromDictionary:metadata];
        }
    }
}
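
For what it's worth, ALAsset can also hand back the location directly via valueForProperty:ALAssetPropertyLocation, which returns a CLLocation. A minimal sketch of reading it inside the resultBlock above (untested on my side, and it needs #import <CoreLocation/CoreLocation.h>):

// inside resultBlock:^(ALAsset *asset) { ... }
id locationValue = [asset valueForProperty:ALAssetPropertyLocation];
if (locationValue && locationValue != ALErrorInvalidProperty)
{
    CLLocation *location = (CLLocation *)locationValue;
    NSLog(@"latitude: %f, longitude: %f",
          location.coordinate.latitude,
          location.coordinate.longitude);
}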
  • Have you tried the accepted answer in http://stackoverflow.com/questions/9319465/how-do-i-get-the-correct-latitude-and-longitude-from-an-uploaded-iphone-photo ? – Rok Jarc Apr 12 '12 at 08:15
  • Using the information in the post you've suggested did do the trick, will that logic work for both situations? That being photos chosen from the Camera Roll and photos taken with the camera? I'm only able to test with the simulator at the moment. PS - if you would like to post your comment as an answer I will mark it as accepted. – ElasticThoughts Apr 12 '12 at 18:19
  • It should - metadata is already 'baked' into image data at the moment your code receives it. – Rok Jarc Apr 13 '12 at 07:26
  • @rokjarc, using the code in my question metadata is returned but only EXIF and TIFF information. However when I changed my code and followed the suggested example I get metadata with EXIF, TIFF and GPS information. Not sure why but at least I have a solution to my problem. – ElasticThoughts Apr 15 '12 at 12:29
  • Tnx for the info - you should write the answer to your question and accept it after 24hrs. – Rok Jarc Apr 15 '12 at 14:53
  • @rokjarc - it seems that this is not working for images taken directly with the camera when returning from UIImagePicker. I've posted another question for this specific issue here http://stackoverflow.com/q/10166575/1104563 – ElasticThoughts Apr 16 '12 at 11:31
