I have a very very simple project set up that allows you to click a "browse photo" button. The user then selects a photo from their photo gallery, and it's displayed on a programmatically created UIImageView.

Works like a charm. However, I am missing key functionality that is required.

I need the user to be able to scale the image (via pinching and dragging) after it is displayed within the UIImageView. Setting allowsEditing = true lets the user crop before the photo is handed back; I need similar functionality, but available once the image is on the main UI.

Help is appreciated. Please and thank you!!

import UIKit

class ViewController: UIViewController {

    @IBOutlet weak var imageView: UIImageView!

    // Convenience accessor for the image view's backing layer.
    var imageViewLayer: CALayer {
        return imageView.layer
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Draw the image into the layer and preserve its aspect ratio.
        imageViewLayer.contents = UIImage(named: "ss3.jpg")?.CGImage
        imageViewLayer.contentsGravity = kCAGravityResizeAspect
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    // Typing the sender as UIPinchGestureRecognizer avoids relying on
    // AnyObject's dynamic lookup to find .scale.
    @IBAction func newGesture(sender: UIPinchGestureRecognizer) {
        imageViewLayer.transform = CATransform3DMakeScale(sender.scale, sender.scale, 1)
    }
}
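For reference, the "browse photo" flow described above would look roughly like this (a minimal sketch: the action name and delegate wiring are illustrative assumptions, not code from this question):

    // Assumes the view controller adopts UIImagePickerControllerDelegate
    // and UINavigationControllerDelegate.
    @IBAction func browsePhotoTapped(sender: AnyObject) {
        let picker = UIImagePickerController()
        picker.sourceType = .PhotoLibrary
        picker.allowsEditing = true   // lets the user crop BEFORE the image is returned
        picker.delegate = self
        presentViewController(picker, animated: true, completion: nil)
    }

    func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : AnyObject]) {
        // Prefer the cropped image when editing was allowed.
        let image = (info[UIImagePickerControllerEditedImage] ?? info[UIImagePickerControllerOriginalImage]) as? UIImage
        imageView.image = image
        picker.dismissViewControllerAnimated(true, completion: nil)
    }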

  • This question is well-posed but very broad. You seem to be asking "how do I make an image editor?" Are you running into any specific problems? – jtbandes Aug 16 '15 at 21:04

1 Answer


I did something similar a while back. I added the image to the UIImageView's layer property, added a gesture recognizer to the view, and implemented the gesture callbacks so that they modify the layer, not the view. Adding the image to the UIImageView's layer did the trick. As a side note, every UIView is backed by a CALayer, which has a lot of methods and properties that help you change the view dynamically; in your case, that will be driven by gestures.

As an alternative, you can also use CALayer's hitTest method instead of implementing the callbacks for gesture recognizers.
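A rough sketch of the hitTest approach (hitTest is real CALayer API; the rest is illustrative and assumes imageView is a direct subview of the controller's view):

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        guard let touch = touches.first else { return }
        // hitTest expects the point in the coordinate space of the layer's
        // superlayer; for imageView's layer that is the controller's view.
        let point = touch.locationInView(view)
        if let hitLayer = imageViewLayer.hitTest(point) {
            // The touch landed on the image layer (or one of its sublayers);
            // begin tracking a drag or scale from here.
            print("Touched layer: \(hitLayer)")
        }
    }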

EDIT - Sample Code

You could do some thing like this:

    @IBOutlet weak var imageView: UIImageView!

    var imageViewLayer: CALayer {
        return imageView.layer
    }

In viewDidLoad, set up the image:

imageViewLayer.contents = UIImage(named: "CoreDataDemoApp")?.CGImage
imageViewLayer.contentsGravity = kCAGravityResizeAspect

Add a pinch gesture recognizer to the image view in the storyboard (or programmatically), and in its callback you could do something like this:

@IBAction func pinchGestureRecognized(sender: UIPinchGestureRecognizer) {
    imageViewLayer.transform = CATransform3DMakeScale(sender.scale, sender.scale, 1)
}

Again this is just to give you an idea of how it could work and it is not the complete code. Hope this helps!
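One refinement worth sketching (an addition for illustration, not part of the original answer): CATransform3DMakeScale rebuilds the transform from identity on every callback, so each new pinch snaps back to the recognizer's absolute scale. Composing onto the current transform and resetting the recognizer makes the zoom cumulative:

    @IBAction func pinchGestureRecognized(sender: UIPinchGestureRecognizer) {
        // Compose the pinch onto whatever transform is already applied.
        imageViewLayer.transform = CATransform3DScale(imageViewLayer.transform, sender.scale, sender.scale, 1)
        // Reset so the next callback reports an incremental scale, not an absolute one.
        sender.scale = 1
    }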

This is another way of doing it:

Stackoverflow link to related question

  • Thanks for the input! I have edited the code in my original post. I can now zoom the image via UIPinchGestureRecognizer; however, this zooms the entire view. I just want to zoom the image _within_ photoImageView. I believe this is what you were suggesting. Any insight on how to zoom the photo instead of the whole view would be appreciated! Thanks! – Joe Aug 17 '15 at 04:07
  • Again, thanks for your help here! I have set up the CALayer, along with adding a gesture recognizer to the imageView. Zoom works, but it's still zooming the entire view, not the layer. I need the layer to stay within the bounds of the imageView. (If you're familiar with CSS, think of it as overflow:hidden.) Any clue how that can be achieved? Thanks – Joe Aug 18 '15 at 02:12
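A sketch of one likely fix for the clipping asked about above: because imageViewLayer is the image view's own backing layer, transforming it scales the whole view, and masksToBounds on that same layer cannot clip it. Hosting the image in a dedicated sublayer and scaling only that sublayer gives the overflow:hidden behavior (imageLayer is an illustrative name, not from the thread; it would be stored as a property so the gesture callback can reach it):

    // In viewDidLoad: host the image in its own sublayer.
    let imageLayer = CALayer()
    imageLayer.frame = imageView.bounds
    imageLayer.contents = UIImage(named: "ss3.jpg")?.CGImage
    imageLayer.contentsGravity = kCAGravityResizeAspect
    imageView.layer.addSublayer(imageLayer)
    imageView.layer.masksToBounds = true   // clip the zoomed sublayer to the image view

    // In the pinch callback, scale only the sublayer.
    imageLayer.transform = CATransform3DScale(imageLayer.transform, sender.scale, sender.scale, 1)
    sender.scale = 1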