I am trying to get the color of pixels/points (doesn't matter for my use case) of the current screen content in iOS. So, for example, I want to get the color of each pixel from screen coordinates (0, 0) to (10, 10). Additionally, the operation should be as fast as possible, since I will do it at regular intervals with a `Timer`. The timer should run multiple times a second, but it doesn't have to be 25fps.
Acceptable solutions:
Anything that returns the current color of a pixel or point on screen at a given position, doesn't produce noticeable UI lag, and doesn't turn my app into a battery hog. The result might be a `CGImage`, a `UIImage`, or a buffer array; I don't really care. I also don't care if the solution uses additional Apple frameworks, such as OpenGL or Metal.
It is also acceptable if the solution does not capture system UI, like the status bar. Capturing the content of my app is sufficient.
Things I tried so far:
- Using `UIWindow`'s `drawHierarchy(in:afterScreenUpdates:)`. This method turns out to be way too slow: on my iPad Pro it took 0.25s, which causes noticeable UI lag.
- Using `CALayer`'s `render(in:)`. This method does not render `UIVisualEffectView`s, which I require. Also, while faster than `drawHierarchy`, I measured it at about 0.04s, which still causes noticeable lag in the UI.
- Using OpenGL, as described here, for example. I don't know anything about OpenGL, so I might be using this wrong, but I never got it to return anything other than a black image.