
Currently I'm developing an Android app for blind people that uses the device's camera.

The app shows the preview of the camera's captured frames in a SurfaceView. When the screen is touched, the app gets the touch coordinates with event.getX() and event.getY(). However, what I really want is the coordinates of the touched point relative to the frame.

With view.getLocationOnScreen() or view.getLocationInWindow() I can get the position of the top-left corner where the SurfaceView begins, but the frame doesn't occupy the whole surface of the SurfaceView.

How can I transform the screen coordinates into coordinates relative to the frame? Or how can I find out where the frame is located inside the SurfaceView? When I use getMarginLeft() or similar I get the value 0.
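
This is roughly what I think I need, as a sketch (not tested): it assumes the preview is aspect-fitted and centered in the SurfaceView (black bars on two sides), and that `previewWidth`/`previewHeight` are the camera preview size I selected.

```java
import android.view.MotionEvent;
import android.view.SurfaceView;

public class TouchToFrameMapper {

    /**
     * Maps a touch point (in SurfaceView coordinates) to coordinates inside the
     * camera frame, assuming the preview is aspect-fitted and centered in the
     * view. Returns {frameX, frameY} in preview pixels.
     */
    public static float[] toFrameCoordinates(SurfaceView view, MotionEvent event,
                                             int previewWidth, int previewHeight) {
        // Scale factor used when the preview is fitted inside the view.
        float scale = Math.min((float) view.getWidth() / previewWidth,
                               (float) view.getHeight() / previewHeight);
        // Size of the visible frame inside the SurfaceView.
        float frameWidth  = previewWidth * scale;
        float frameHeight = previewHeight * scale;
        // Offsets of the frame's top-left corner (the black borders).
        float offsetX = (view.getWidth()  - frameWidth)  / 2f;
        float offsetY = (view.getHeight() - frameHeight) / 2f;
        // Convert view coordinates back to preview-pixel coordinates.
        float frameX = (event.getX() - offsetX) / scale;
        float frameY = (event.getY() - offsetY) / scale;
        return new float[] { frameX, frameY };
    }
}
```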

Jumy Elerossë
  • You first have to convert `pixel coordinates` to `canvas coordinates`. Check [this example](http://stackoverflow.com/questions/6852822/how-do-i-correctly-translate-pixel-coordinates-to-canvas-coordinates-in-android) – hrskrs Mar 13 '15 at 07:46
  • Thanks for the answer. I can see that the canvas coordinates are obtained like: `Canvas c = new Canvas(); int cx = c.getWidth(); int cy = c.getHeight(); ` However, I don't know how `c` is initialised. Also, I have the problem that the video frame leaves black borders at the sides of the image inside the `SurfaceView`. I would need to adjust the frame so it fills the whole `SurfaceView` and then use a canvas over the `SurfaceView`, or remove those borders somehow (a rough sketch of that resizing is below). – Jumy Elerossë Mar 17 '15 at 08:51
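
Something like this is what I had in mind for removing the borders: resize the SurfaceView so its aspect ratio matches the preview's. This is only a rough sketch, assuming the preview size is already known, the layout allows changing the view's height, and display rotation is ignored.

```java
import android.view.SurfaceView;
import android.view.ViewGroup;

public class PreviewSizer {

    /**
     * Gives the SurfaceView the same aspect ratio as the camera preview so the
     * frame fills it and no black borders remain. previewWidth/previewHeight
     * are the chosen camera preview size; display rotation is not handled.
     */
    public static void matchSurfaceToPreview(SurfaceView surfaceView,
                                             int previewWidth, int previewHeight) {
        int viewWidth = surfaceView.getWidth();
        float aspect = (float) previewHeight / previewWidth;
        ViewGroup.LayoutParams lp = surfaceView.getLayoutParams();
        lp.height = Math.round(viewWidth * aspect);  // keep width, adjust height
        surfaceView.setLayoutParams(lp);
    }
}
```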

0 Answers