
I'm pretty sure the answer to this is basically "No", but perhaps there is a way to do it.

Under iOS, I would like to be able to specify a fragment shader to composite UIViews together, rather than simply having them combine with alpha-based blending.

The specific use case I have in mind is a foreground layer scrolling over two background layers: one a day-time image and the other a night-time image. The foreground view will control which of the two background views shows through by linearly blending between them. The two backgrounds will also be scrolling (more slowly than the foreground), giving a parallax effect.
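For concreteness, the day/night cross-fade described above could be expressed as a fragment shader along these lines. This is only an illustrative sketch; the texture and uniform names (`uDayTexture`, `uNightTexture`, `uNightAmount`) are my own invention, and the shader assumes both backgrounds are available as textures:

```glsl
precision mediump float;

varying vec2 vTexCoord;

uniform sampler2D uDayTexture;    // day-time background
uniform sampler2D uNightTexture;  // night-time background
uniform float uNightAmount;       // 0.0 = full day, 1.0 = full night

void main() {
    vec4 day   = texture2D(uDayTexture, vTexCoord);
    vec4 night = texture2D(uNightTexture, vTexCoord);

    // Linear blend between the two backgrounds.
    gl_FragColor = mix(day, night, uNightAmount);
}
```

The foreground would drive `uNightAmount` each frame (e.g. from its scroll position) to control which background shows through.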

In general, I am frustrated that the iPhone (and any other modern smartphone) has an awesomely powerful GPU chained up inside it, and all the UI asks of it, for the most part, is trivial blending. Wouldn't it be cool to have a UIView with a layer representing surface normals, so that as it is rotated it gets correctly shaded by a light source? Wouldn't that be great?
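The surface-normal idea above would amount to simple per-pixel Lambertian shading in the compositing step. Again purely as an illustrative sketch (the uniform names are hypothetical, and the normal map is assumed to be encoded into [0,1] as is conventional):

```glsl
precision mediump float;

varying vec2 vTexCoord;

uniform sampler2D uColorTexture;  // the view's rendered content
uniform sampler2D uNormalMap;     // per-pixel surface normals, encoded in [0,1]
uniform vec3 uLightDirection;     // normalised light direction in the layer's space

void main() {
    vec4 colour = texture2D(uColorTexture, vTexCoord);

    // Decode the normal from [0,1] back to [-1,1] and renormalise.
    vec3 normal = normalize(texture2D(uNormalMap, vTexCoord).xyz * 2.0 - 1.0);

    // Lambertian diffuse term: brightest where the surface faces the light.
    float diffuse = max(dot(normal, uLightDirection), 0.0);

    gl_FragColor = vec4(colour.rgb * diffuse, colour.a);
}
```

As the view rotated, the app would update `uLightDirection` (or transform the normals) so the shading tracked the light source.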

Any thoughts? Thanks!

Benjohn
  • This [question](http://stackoverflow.com/questions/13968233/apply-gpuimage-filter-to-a-uiview) looks like it might give some leads; it mentions [`GPUImage`](https://github.com/BradLarson/GPUImage), an open-source, BSD-licensed library on GitHub. – Benjohn Nov 26 '13 at 21:51
  • Yeah, I was about to suggest that (of course I would), because if the view content itself isn't animating, you might be able to pull them in with a GPUImageUIElement and apply an animated transform and blend to achieve the effect you want. It will still take some work, but it should be possible that way. I've pulled off some [interesting lighting effects](http://stackoverflow.com/questions/11614162/ios-images-in-a-bubble-effect/11614991#11614991) using shaders and appropriate blends before. – Brad Larson Nov 26 '13 at 23:39
  • Thanks for getting in touch @BradLarson. Do you know, is it possible to "reach in" to the layer compositing so that when my layer is composited in to the scene, instead of just being a texture look up fragment shader, I can specify something a bit more exotic? Alternatively, I only have a cursory idea of how iOS's layers work, most of it inferred – if you know of a good reference, could you point me to it, please? – Benjohn Nov 27 '13 at 10:21
  • We don't really have access to the internals of UIKit's compositing engine, so no, we can't override what is done in UIKit itself. You can roll your own compositing implementation with blended OpenGL ES textures drawn from UIViews and a custom shader lookup, but you're not going to be able to dig into the UIKit compositing engine. Apple hasn't exposed that, at least in part because of security implications around their current implementation. Until iOS 7's blurring, few people wanted this anyway. – Brad Larson Nov 27 '13 at 16:01
  • Didn't know about the security implications at all, thanks @BradLarson – do you have a link discussing that? Perhaps that is something that will come in the future; it feels like it might make sense, doesn't it? Although the current iOS 7 aesthetic seems fairly print like and moving away from cool / realistic shading effects. Unless they're blur, so maybe not. – Benjohn Nov 27 '13 at 23:32
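Following up on the GPUImage route discussed in the comments: a custom two-input GPUImage filter is defined by a fragment shader string using the library's two-input naming conventions (`inputImageTexture`, `inputImageTexture2`, `textureCoordinate`, `textureCoordinate2`), so the day/night cross-fade might look roughly like this. The `nightAmount` uniform is my own addition, not part of the library:

```glsl
varying highp vec2 textureCoordinate;
varying highp vec2 textureCoordinate2;

uniform sampler2D inputImageTexture;   // first input: day image
uniform sampler2D inputImageTexture2;  // second input: night image
uniform lowp float nightAmount;        // custom uniform driven from the app

void main() {
    lowp vec4 day   = texture2D(inputImageTexture, textureCoordinate);
    lowp vec4 night = texture2D(inputImageTexture2, textureCoordinate2);
    gl_FragColor = mix(day, night, nightAmount);
}
```

The app would feed the two background views in (e.g. via `GPUImageUIElement`, as Brad suggests) and animate `nightAmount` each frame, which GPUImage filters support setting by uniform name.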

0 Answers