
I'm currently developing a small hobby project for Android using OpenGL ES 2.0, and I'm looking into the differences between textures and renderbuffers as render targets for a shader. From what I understand, you can't sample renderbuffer data directly in a shader, which forces you to use a texture whenever you need to sample the result.

But when I looked into the extensions supported by my phone, I saw that the extension "GL_OES_rgb8_rgba8" lets you create a renderbuffer with an RGBA8 format for use as the color attachment of an FBO, not to mention that "vanilla" OpenGL ES 2.0 itself already supports attaching a renderbuffer to the color attachment of an FBO.

My question is: why does this feature exist? What use is there in rendering color to a renderbuffer when you can't sample from it? Am I missing some key information or programming tactic that makes this useful?
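For context, the setup being asked about looks roughly like this. This is only a sketch: it needs a live EGL context to run, `width`, `height`, and `pixels` are assumed to be defined elsewhere, and `GL_RGBA8_OES` is only valid when the `GL_OES_rgb8_rgba8` extension is advertised (core ES 2.0 only guarantees formats like `GL_RGB565`, `GL_RGBA4`, and `GL_RGB5_A1` for renderbuffer color storage):

```c
/* Sketch: render-to-renderbuffer in OpenGL ES 2.0.
   Requires a current EGL context; GL_RGBA8_OES requires GL_OES_rgb8_rgba8. */
GLuint fbo, rbo;
glGenFramebuffers(1, &fbo);
glGenRenderbuffers(1, &rbo);

/* Allocate color storage in the renderbuffer. */
glBindRenderbuffer(GL_RENDERBUFFER, rbo);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8_OES, width, height);

/* Attach it as the FBO's color attachment. */
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, rbo);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    /* Fall back to a texture attachment or a core-supported format. */
}

/* ... draw ... the result can't be sampled in a shader,
   but it can be read back on the CPU: */
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
```

The trade-off is that drivers are free to store a renderbuffer in a layout optimized purely for writing and readback, since they never need to support texture filtering on it.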

Lolslayer
  • In old GL you could read color attachments back in CPU code with `glReadPixels`. It was used for many things, like fast O(1) pixel-perfect mouse selection of rendered objects, and much more. However, with these days' drivers this feature is no longer reliable :(. The only thing that still works this way (that I know of) is the stencil buffer, but that is only 8-bit, which limits the use a lot. Here is an example: [OpenGL 3D-raypicking with high poly meshes](https://stackoverflow.com/a/51764105/2521214) – Spektre May 26 '20 at 06:51
  • Hey Spektre, thanks for commenting on my question! So from my understanding, you mean that renderbuffers were viable for CPU access? – Lolslayer May 26 '20 at 08:57
  • Yes, among other things ... I think they were added to GL in pre-shader times, when all you had was the fixed-function pipeline. – Spektre May 26 '20 at 09:31
  • That makes sense, thank you! – Lolslayer May 27 '20 at 14:55

0 Answers