I am using a long press gesture on a UIImageView. The UIImageView is added to a UIScrollView as a subview, and the scroll view is added as a subview of a UIView. I am trying to draw a line on the image from the point where the user performed the long press gesture to another random point; the random point will be set programmatically. Currently the long press gesture location on screen and the point I am getting are different, and I don't know why. My code is:

- (void)handleLongPressGestures:(UILongPressGestureRecognizer *)sender
{
    NSLog(@"Long Press Gesture");

    // Gesture location in the image view's coordinate system.
    CGPoint location = [sender locationInView:self.imageView];

    UIColor *linearcolor = [UIColor whiteColor];

    ///////// code //////

    ////// code end /////

    // Draw the original image (an ivar in this code) into a context of the same size.
    UIGraphicsBeginImageContext(originalImage.size);
    [originalImage drawAtPoint:CGPointMake(0, 0)];

    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 5.0);

    // Endpoint of the line; a plain CGPoint, not an uninitialized pointer.
    CGPoint location2 = CGPointMake(0.0, 0.0);

    CGContextMoveToPoint(context, location.x, location.y);
    CGContextAddLineToPoint(context, location2.x, location2.y);
    CGContextSetStrokeColorWithColor(context, [linearcolor CGColor]);
    CGContextStrokePath(context);

    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    self.imageView.image = newImage;
}
  • Please check in your method whether the image size is the same as the size of the image view you collect the touch location from, and also that the origin of the image view is at (0,0). – Matic Oblak Jan 13 '15 at 14:26
  • @MaticOblak please elaborate .... :( – iOS_Learner Jan 13 '15 at 14:50
  • Create a breakpoint in the method you posted and log the image size and the image view frame to confirm that the location of the recognizer and the location on the context you are drawing to are in fact in the same coordinate system, which is the case only if the size of the image equals the size of the image view and the image view has its origin at (0,0). So print the image size and print the image view frame (see the sketch after these comments). – Matic Oblak Jan 13 '15 at 16:10
  • @MaticOblak Thanks a lot for the elaboration... It worked. The frame was not equal to the image size. :) – iOS_Learner Jan 14 '15 at 05:19
  • @MaticOblak Is there any other way to resolve the issue? I mean, what if I don't want to change the size of the image according to the frame? That is, the image size and the frame will be different. How can I then map the gesture location to the UIImageView frame? – iOS_Learner Jan 14 '15 at 07:29
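
A quick way to run the check from the comments above, as a minimal sketch assuming the handler from the question (originalImage and self.imageView as in the posted code):

NSLog(@"image size: %@", NSStringFromCGSize(originalImage.size));
NSLog(@"image view frame: %@", NSStringFromCGRect(self.imageView.frame));

If the logged size and frame differ, the gesture location and the drawing context are in different coordinate systems, which matches the symptom in the question.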

1 Answer

It is quite common that the image size is not the same as the image view size. This results in the gesture location being reported in a different coordinate system than the context coordinate system. You need to either make the sizes the same or, preferably, convert the gesture location into the context coordinate system.

So if you have the image view size viewSize, a context size imageSize, and the location in the image view viewLocation, you need the corresponding location in the context, contextLocation. It is quite simple: create a relative point by dividing the original location by the original size, then multiply it by the target size:

CGPoint relativePoint = CGPointMake(viewLocation.x/viewSize.width, viewLocation.y/viewSize.height);
CGPoint contextLocation = CGPointMake(relativePoint.x*imageSize.width, relativePoint.y*imageSize.height);

Or simply:

CGPoint contextLocation = CGPointMake(viewLocation.x*imageSize.width/viewSize.width, viewLocation.y*imageSize.height/viewSize.height);
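
Applied to the question's handler, the conversion would look roughly like this sketch (assuming the image view draws the image scale-to-fill; an aspect-fit mode would additionally need the letterbox offset handled):

CGSize viewSize = self.imageView.bounds.size;  // gesture coordinate system
CGSize imageSize = originalImage.size;         // drawing context coordinate system

// Map the long-press location from view coordinates into image coordinates.
CGPoint contextLocation = CGPointMake(location.x * imageSize.width / viewSize.width,
                                      location.y * imageSize.height / viewSize.height);

CGContextMoveToPoint(context, contextLocation.x, contextLocation.y);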

This is very commonly used to transition between coordinate systems, but things can get more complicated in cases where you need to include the origin as well, or where one of the axes is flipped. For instance, by default the OpenGL coordinate system runs from -1 to 1 on the X axis and from 1 to -1 on the Y axis with respect to the iOS view frame. In that case you would convert the gesture location to the OpenGL coordinate system like so:

CGPoint contextLocation = CGPointMake((viewLocation.x/viewSize.width)*2.0f - 1.0f, (1.0f-viewLocation.y/viewSize.height)*2.0f - 1.0f);

Might come in handy some day.

Matic Oblak
  • once again I need your assistance. Now I have added a zoom-in feature for the UIImageView. When the image is zoomed in, it is again not pointing to the exact location, although in normal mode it works fine and I am getting the relative location.... :( – iOS_Learner Jan 14 '15 at 11:22
  • Yes, your image view now has a transform, since it is most likely returned by your zoom delegate method. What you should do is create an empty view with the same size as your image view. Add the image view to this empty view. Add the empty view to the scroll view. Return the empty view in the delegate method (see the sketch after these comments). Collect the gesture location from the image view as you already do. The rest of the code should stay the same. – Matic Oblak Jan 14 '15 at 12:03
  • Thanks once again.... now it's working fine. I'm getting some minor issues regarding page scroll and its area, but I hope they will be resolved soon. :) – iOS_Learner Jan 15 '15 at 04:53
  • can you check this question please ....http://stackoverflow.com/questions/27958089/remove-line-from-uiimageview?noredirect=1#comment44310735_27958089 – iOS_Learner Jan 15 '15 at 07:43
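
A minimal sketch of the container setup described in the comments above; the containerView and scrollView property names are illustrative, not from the original code:

// In setup (e.g. viewDidLoad): wrap the image view in a plain container
// so the scroll view's zoom transform is applied to the container instead.
UIView *containerView = [[UIView alloc] initWithFrame:self.imageView.frame];
self.imageView.frame = containerView.bounds;
[containerView addSubview:self.imageView];
[self.scrollView addSubview:containerView];
self.containerView = containerView;

// UIScrollViewDelegate: return the container, not the image view.
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
    return self.containerView;
}

With this arrangement the image view itself keeps an identity transform, so [sender locationInView:self.imageView] continues to report points in the image view's own, unscaled coordinate system.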