In my iOS application, I have a texture applied to a sphere rendered with OpenGL ES 1. The sphere can be rotated by the user. How can I track where a given point on the texture is in 2D screen space at any given time?
For example, given the point (200, 200) on a 1000 × 1000 px texture, I'd like to place a UIButton on top of my OpenGL view that tracks that point as the sphere is manipulated.
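For concreteness, here's the mapping I have in mind, as a minimal sketch. I'm assuming a standard equirectangular (latitude/longitude) texture wrap; the names and conventions below are illustrative, not my actual code:

```c
#include <math.h>
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

/* Map a texel (tx, ty) on a texW x texH texture to a point on a
 * unit sphere, assuming an equirectangular wrap. The wrap convention
 * here is an assumption about how the texture is applied. */
static Vec3 texelToSphere(float tx, float ty, float texW, float texH)
{
    float u = tx / texW;              /* 0..1 across the texture     */
    float v = ty / texH;              /* 0..1 down the texture       */
    float lon = u * 2.0f * M_PI;      /* longitude: 0..2*pi          */
    float lat = (v - 0.5f) * M_PI;    /* latitude: -pi/2..+pi/2      */
    Vec3 p = {
        cosf(lat) * sinf(lon),
        sinf(lat),
        cosf(lat) * cosf(lon)
    };
    return p;                         /* unit-length model-space point */
}

int main(void)
{
    /* The point from my example: (200, 200) on a 1000x1000 texture. */
    Vec3 p = texelToSphere(200.0f, 200.0f, 1000.0f, 1000.0f);
    printf("model-space point: (%f, %f, %f)\n", p.x, p.y, p.z);
    /* The part I'm stuck on: rotating p by the sphere's current model
     * matrix, projecting it to screen space, and moving the UIButton. */
    return 0;
}
```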
What's the best way to do this?
On my first attempt, I tried a color-picking technique: I rendered a second sphere into an off-screen framebuffer, using a black texture with a red square at point (200, 200), then used glReadPixels() to find the red square's position and moved my button accordingly. Unfortunately, reading back all the pixel data and iterating over it 60 times a second just isn't possible for obvious performance reasons. I tried a number of ways to optimize this hack (e.g., checking only the red channel, sampling every 4th pixel), but it never proved reliable.
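Here's a stripped-down sketch of what that attempt looked like. pickFramebuffer, viewWidth, and viewHeight are placeholders for my actual setup, and the draw call that renders the marker sphere is omitted:

```c
#include <OpenGLES/ES1/gl.h>
#include <OpenGLES/ES1/glext.h>
#include <stdlib.h>

/* After drawing the marker sphere (black texture, red square at
 * (200, 200)) into the off-screen framebuffer, read everything back
 * and scan for the red marker. Returns 1 and writes the marker's
 * view coordinates if found, 0 otherwise. */
static int findRedPixel(GLuint pickFramebuffer,
                        GLint viewWidth, GLint viewHeight,
                        GLint *outX, GLint *outY)
{
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, pickFramebuffer);

    GLubyte *pixels = malloc((size_t)viewWidth * viewHeight * 4);
    if (!pixels) return 0;
    glReadPixels(0, 0, viewWidth, viewHeight,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* This full scan, 60 times a second, is what kills performance. */
    int found = 0;
    for (GLint y = 0; y < viewHeight && !found; y++) {
        for (GLint x = 0; x < viewWidth; x++) {
            GLubyte *px = pixels + ((size_t)y * viewWidth + x) * 4;
            if (px[0] > 200 && px[1] < 50 && px[2] < 50) { /* mostly red */
                *outX = x;
                *outY = viewHeight - 1 - y;  /* GL rows are bottom-up */
                found = 1;
                break;
            }
        }
    }
    free(pixels);
    return found;
}
```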
I'm an OpenGL noob, so I'd appreciate any guidance. Is there a better solution? Thanks!