So my app lets a user select a photo from the iOS device's Photo Library. When one is selected, I check for GPS metadata:
NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
    NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
    if (url) {
        ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
        [assetsLibrary assetForURL:url
                       resultBlock:^(ALAsset *myasset) {
                           CLLocation *photoLocation = [myasset valueForProperty:ALAssetPropertyLocation];
                           DLog(@"photoLocation: %@", photoLocation);
                       }
                      failureBlock:^(NSError *error) {
                          DLog(@"can't get image: %@", [error localizedDescription]);
                      }];
    }
}
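For reference, this is roughly how I planned to unpack the CLLocation inside the resultBlock once it fires (a minimal sketch; locationString is just a name I made up, and photoLocation may be nil if the asset has no location data):

```objectivec
// Inside resultBlock: — CLLocation exposes the GPS fix directly.
if (photoLocation) {
    CLLocationDegrees lat = photoLocation.coordinate.latitude;
    CLLocationDegrees lon = photoLocation.coordinate.longitude;
    // Hypothetical formatting of the fix for logging/upload.
    NSString *locationString = [NSString stringWithFormat:@"%f,%f", lat, lon];
    DLog(@"coordinates: %@", locationString);
}
```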
I'm a bit lost at this point. I need to convert the photo to NSData for local storage, and then to Base64 to upload it to our server.
If I use:
NSData *photoData = UIImageJPEGRepresentation([info objectForKey:UIImagePickerControllerOriginalImage], 1);
Will that wrap the metadata up into the image data, so that when I upload it, the server will be able to read the metadata from the image file?
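For context, here is roughly how I planned the Base64 step (a sketch, assuming iOS 7+, where NSData gained base64EncodedStringWithOptions:; on earlier versions a third-party encoder would be needed):

```objectivec
// Get JPEG data from the picked image, as above.
NSData *photoData = UIImageJPEGRepresentation(
    [info objectForKey:UIImagePickerControllerOriginalImage], 1.0);

// Base64-encode the JPEG bytes for the upload body (iOS 7+ API).
NSString *base64Photo =
    [photoData base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
```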