I am using a custom path animation on UIImageView items for a Swift 3 project. The code outline is as follows:
// parentView and other parameters are configured externally
let imageView = UIImageView(image: image)
imageView.isUserInteractionEnabled = true
let gr = UITapGestureRecognizer(target: self, action: #selector(onTap(gesture:)))
parentView.addGestureRecognizer(gr)
parentView.addSubview(imageView)
// Then I set up animation, including:
let animation = CAKeyframeAnimation(keyPath: "position")
// .... eventually ....
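// For illustration only, the configuration looks roughly like this
// (the path, duration, and timing values below are placeholders, not my exact setup):
let path = UIBezierPath()
path.move(to: imageView.center)
path.addQuadCurve(to: CGPoint(x: 40, y: 400), controlPoint: CGPoint(x: 300, y: 100))
animation.path = path.cgPath
animation.duration = 5.0
animation.calculationMode = kCAAnimationPaced
animation.isRemovedOnCompletion = false
animation.fillMode = kCAFillModeForwards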
imageView.layer.add(animation, forKey: nil)
The onTap method is declared in a standard way:
func onTap(gesture: UITapGestureRecognizer) {
print("ImageView frame is \(self.imageView.layer.visibleRect)")
print("Gesture occurred at \(gesture.location(in: FloatingImageHandler.parentView))")
}
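For context, the floaters are created in a loop, roughly like the following sketch (images and startAnimation(for:) are placeholder names here, not my real API; the animation setup itself is the one outlined above):
for image in images {
    let imageView = UIImageView(image: image)
    imageView.isUserInteractionEnabled = true
    let gr = UITapGestureRecognizer(target: self, action: #selector(onTap(gesture:)))
    parentView.addGestureRecognizer(gr)  // one recognizer per floater, but attached to parentView
    parentView.addSubview(imageView)
    startAnimation(for: imageView)       // starts the CAKeyframeAnimation shown above
}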
The problem is that each time I call addGestureRecognizer, the previous gesture recognizer gets overwritten, so any detected tap always points to the LAST added image, and the tap location is not detected accurately: tapping anywhere on parentView still triggers the onTap method.
How can I detect a tap accurately on a per-imageView basis? I cannot use UIView.animate or similar methods because of the custom path animation requirement, and I also cannot cover the parent view with a transparent overlay UIView, since I need these "floaters" to not swallow events.