I'm attempting to implement an iOS camera view that takes square pictures (similar to Instagram). My code appears below. The first part, where the frame height is set equal to the frame width, works as expected and the user is shown a square view. The problem comes later, when I try to apply that frame (a CGRect property) to the image data using CGImageCreateWithImageInRect. I pass the frame rect to this function along with the image, but the result is not cropped to a square; the image keeps the original dimensions that come out of the iOS camera. Can someone please tell me what I've done wrong? My understanding from the Apple documentation is that CGImageCreateWithImageInRect should return the portion of the image that falls inside the given rect, starting at that rect's x/y origin, but that doesn't seem to be happening. (I've put a minimal sketch of how I understand the call is supposed to be used at the end of this post.)
//Set the frame size to be square shaped
UIView *view = imagePicker.view;
frame = view.frame;
frame.size.height = frame.size.width;
view.frame = frame;
//Crop the image to the frame dimensions using CGImageCreateWithImageInRect
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self.popoverController dismissPopoverAnimated:true];
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    [self dismissModalViewControllerAnimated:YES];

    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
        croppedImage = (__bridge UIImage *)(CGImageCreateWithImageInRect((__bridge CGImageRef)(image), frame));
        imageView.image = croppedImage;
    }
    else if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
        // Code here to support video if enabled
    }
}
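
For reference, this is a minimal sketch of how I understand CGImageCreateWithImageInRect is supposed to be used, based on the docs: go through the image's underlying CGImage, pass a rect in that image's pixel coordinates, and wrap the returned CGImage back into a UIImage. The helper name SquareCropSketch is just for illustration and isn't part of my app.

static UIImage *SquareCropSketch(UIImage *image, CGRect cropRect)
{
    // Pass the underlying CGImage, not the UIImage itself.
    CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, cropRect);
    if (croppedRef == NULL) {
        return nil; // cropRect did not intersect the image bounds
    }

    // CGImageCreateWithImageInRect returns a +1 retained CGImage,
    // so wrap it in a UIImage and then release it.
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(croppedRef);
    return cropped;
}

One thing I'm unsure about is whether the rect I pass here should be in the picker view's point coordinates (like my frame property) or in the photo's pixel coordinates, which may be part of my problem.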