My app picks a photo from the camera roll and stores it as NSData. When I need to show the image in my app, I create a UIImage from that NSData.
If the original photo is an image I saved from the web (Google Images or similar), the UIImage has the right orientation; if it is a portrait photo I shot with the iPhone, the UIImage is rotated 90 degrees to the left.
I cannot figure out how to know which photos need to be rotated and which are OK.
Any help is appreciated.
This is the code I use to save the image as NSData in a Core Data entity:
asset.requestContentEditingInputWithOptions(PHContentEditingInputRequestOptions()) { (input, info) in
    if let imgurl = input?.fullSizeImageURL {
        // Load the full-size image from the asset's URL
        let img = UIImage(data: NSData(contentsOfURL: imgurl)!)
        self.gearImage = img
        // Show it in the preview image view
        self.photoImageView.contentMode = UIViewContentMode.ScaleAspectFill
        self.photoImageView.image = self.gearImage
    }
}
Then I save the NSData in Core Data:
if self.gearImage != nil {
    newItem.cgPhoto = UIImagePNGRepresentation(self.gearImage!)
}
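If it helps, here is a minimal round-trip check I could run (just a debugging sketch; my assumption is that the orientation flag gets lost somewhere between UIImagePNGRepresentation and UIImage(data:)):
if let original = self.gearImage,
    pngData = UIImagePNGRepresentation(original),
    reloaded = UIImage(data: pngData) {
    // I expect a camera portrait to report a non-Up orientation here (e.g. .Right)
    print("before round trip: \(original.imageOrientation.rawValue)")
    // ...and the reloaded image to come back as .Up if the flag is being dropped
    print("after round trip: \(reloaded.imageOrientation.rawValue)")
}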
When I have to show the image, I fetch the Core Data entity and:
let img = UIImage(data: self.imageData)
And here is the problem: if the PHAsset's fullSizeImageOrientation is null or 1 the image is OK, otherwise it has the wrong rotation.
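For completeness, this is roughly how I read that value (a sketch, in the same requestContentEditingInputWithOptions callback as above; the logging is only for debugging):
asset.requestContentEditingInputWithOptions(PHContentEditingInputRequestOptions()) { (input, info) in
    if let input = input {
        // EXIF-style orientation of the full-size image; 1 means already upright
        print("fullSizeImageOrientation: \(input.fullSizeImageOrientation)")
    }
}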
Max