I am trying to send a PHAsset image over email. I am first trying to get a base case working in a sandbox app, and then I eventually want to iterate over an array of PHAssets. Below is my sandbox code, followed by a couple of questions/issues I am having. (Some of this code snippet came from this SO post.) This is all part of implementing the multi-asset picker I found called TLPHAssets, which looked to be the most current Swift 4 / PHAsset-based option from this list on another SO post.
func dismissPhotoPicker(withTLPHAssets: [TLPHAsset]) {
    // Use selected order, full-resolution image
    self.selectedAssets = withTLPHAssets

    let mail = MFMailComposeViewController()
    mail.mailComposeDelegate = self
    mail.setToRecipients(["emailaddress.com"])
    // Put the rest of the email together and present the draft to the user
    mail.setSubject("Subject")

    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true
    options.version = .current
    options.deliveryMode = .opportunistic
    options.resizeMode = .fast

    let asset: PHAsset = self.selectedAssets[0].phAsset!
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: CGSize(width: 400, height: 400),
                                          contentMode: .aspectFit,
                                          options: options,
                                          resultHandler: { (result, info) in
        if let image = result {
            let imageData: NSData = UIImageJPEGRepresentation(image, 1.0)! as NSData
            mail.addAttachmentData(imageData as Data, mimeType: "image/jpg", fileName: "BeforePhoto.jpg")
        }
    })
}
Two questions (I'm pretty new to Swift so apologies if I'm missing something basic):
I'm having trouble getting the resultHandler block to execute at all. I originally had targetSize set to PHImageManagerMaximumSize, then changed it to 400x400 on another post's recommendation. (I'm just picking an image from my camera roll, and I've confirmed in my test case that the PHAsset's mediaType is image. I'm not working with Live Photos or video yet, although I plan to tackle those next and will add a switch statement to ensure I call the right asset-request method.) Why wouldn't PHImageManagerMaximumSize work? And when I run the app in the simulator and pick one of the five canned images (all of which are mediaType image), I still never reach the resultHandler. Any idea what might be wrong with my requestImage call that is preventing the resultHandler from being called?
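In case it helps diagnose, here is the kind of synchronous variant I'm considering trying next, just a sketch based on my reading of the PHImageRequestOptions docs (untested in my sandbox):

```swift
let syncOptions = PHImageRequestOptions()
syncOptions.isSynchronous = true           // handler is called once, before requestImage returns
syncOptions.isNetworkAccessAllowed = true  // allow downloads from iCloud if the asset isn't local
// Note: with isSynchronous = true, deliveryMode is treated as .highQualityFormat

PHImageManager.default().requestImage(for: asset,
                                      targetSize: CGSize(width: 400, height: 400),
                                      contentMode: .aspectFit,
                                      options: syncOptions) { image, info in
    // If no image comes back, the info dictionary should say why
    if let error = info?[PHImageErrorKey] as? NSError {
        print("Image request failed: \(error)")
    } else if image == nil {
        print("No error reported, but image is nil; info: \(String(describing: info))")
    }
}
```

My understanding is that .opportunistic delivery can also call the handler more than once (first with a degraded image, flagged via PHImageResultIsDegradedKey), so logging the info dictionary seems like a reasonable first debugging step either way.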
I did eventually get an image from my camera roll to trigger the resultHandler with targetSize at 400x400. But now, inside the resultHandler, my sandbox app crashes on the "let imageData: NSData ..." line with: "2017-10-26 15:35:26.604997-0700 MultiAssetPicker[15766:4904592] fatal error: unexpectedly found nil while unwrapping an Optional value".
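I realize the force unwrap is fragile; I assume the safer pattern looks something like the sketch below (untested), though I'd still like to understand why UIImageJPEGRepresentation would return nil here in the first place:

```swift
// Bind both optionals instead of force-unwrapping the JPEG conversion
if let image = result,
   let imageData = UIImageJPEGRepresentation(image, 1.0) {
    mail.addAttachmentData(imageData, mimeType: "image/jpeg", fileName: "BeforePhoto.jpg")
} else {
    print("No image, or JPEG conversion failed; skipping the attachment")
}
```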
Any and all help very much appreciated.
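For context, my eventual plan for the multi-asset case is to batch the image requests with a DispatchGroup and only present the mail draft once every attachment has been added. A rough sketch of what I have in mind (untested; the names are my placeholders):

```swift
let group = DispatchGroup()
let manager = PHImageManager.default()
let batchOptions = PHImageRequestOptions()
batchOptions.isNetworkAccessAllowed = true
batchOptions.deliveryMode = .highQualityFormat  // one callback per request

for (index, tlAsset) in self.selectedAssets.enumerated() {
    guard let asset = tlAsset.phAsset else { continue }
    group.enter()
    manager.requestImage(for: asset,
                         targetSize: CGSize(width: 400, height: 400),
                         contentMode: .aspectFit,
                         options: batchOptions) { image, _ in
        defer { group.leave() }
        guard let image = image,
              let data = UIImageJPEGRepresentation(image, 1.0) else { return }
        mail.addAttachmentData(data, mimeType: "image/jpeg", fileName: "Photo\(index).jpg")
    }
}

// Present the draft only after all requests have completed
group.notify(queue: .main) {
    self.present(mail, animated: true, completion: nil)
}
```

If there's a better pattern for waiting on multiple asset requests before presenting, I'd welcome that advice too.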