5

I am trying to do some OpenGL ES 2.0 rendering to an image file, independent of the rendering being shown on the screen to the user. The image I'm rendering to is a different size than the user's screen. I just need a byte array of GL_RGB data. I'm familiar with glReadPixels, but I don't think it would do the trick in this case since I'm not pulling from an already-rendered user screen.

Pseudocode:

// Switch rendering to another buffer (framebuffer? renderbuffer?)

// Draw code here

// Save byte array of rendered data GL_RGB to file

// Switch rendering back to user's screen.

How can I do this without interrupting the user's display? I'd rather not have to flicker the user's screen, drawing my desired information for a single frame, glReadPixel-ing and then having it disappear.

Again, I don't want any of this to show to the user. Here's my code; it doesn't work. Am I missing something?

unsigned int canvasFrameBuffer;
bglGenFramebuffers(1, &canvasFrameBuffer);
bglBindFramebuffer(BGL_RENDERBUFFER, canvasFrameBuffer);
unsigned int canvasRenderBuffer;
bglGenRenderbuffers(1, &canvasRenderBuffer);
bglBindRenderbuffer(BGL_RENDERBUFFER, canvasRenderBuffer);
bglRenderbufferStorage(BGL_RENDERBUFFER, BGL_RGBA4, width, height);
bglFramebufferRenderbuffer(BGL_FRAMEBUFFER, BGL_COLOR_ATTACHMENT0, BGL_RENDERBUFFER, canvasRenderBuffer);

unsigned int canvasTexture;
bglGenTextures(1, &canvasTexture);
bglBindTexture(BGL_TEXTURE_2D, canvasTexture);
bglTexImage2D(BGL_TEXTURE_2D, 0, BGL_RGB, width, height, 0, BGL_RGB, BGL_UNSIGNED_BYTE, 0);
bglFramebufferTexture2D(BGL_FRAMEBUFFER, BGL_COLOR_ATTACHMENT0, BGL_TEXTURE_2D, canvasTexture, 0);

Matrix::matrix_t identity;
Matrix::LoadIdentity(&identity);
bglClearColor(1.0f, 1.0f, 1.0f, 1.0f);
bglClear(BGL_COLOR_BUFFER_BIT);
Draw(&identity, &identity, this);
bglFlush();
bglFinish();

byte *buffer = (byte*)Z_Malloc(width * height * 4, ZT_STATIC);
bglReadPixels(0, 0, width, height, BGL_RGB, BGL_UNSIGNED_BYTE, buffer);
SaveTGA("canvas.tga", buffer, width, height);
Z_Free(buffer);

// unbind frame buffer
bglBindRenderbuffer(BGL_RENDERBUFFER, 0);
bglBindFramebuffer(BGL_FRAMEBUFFER, 0);
bglDeleteTextures(1, &canvasTexture);
bglDeleteRenderbuffers(1, &canvasRenderBuffer);
bglDeleteFramebuffers(1, &canvasFrameBuffer);
user1054922
  • I found some information on render-to-texture, but I'm not sure that's what I want, since I want to render to a byte array for saving/analyzing the image data. I'm curious as to what the correct path is. – user1054922 May 12 '13 at 16:30
  • Whether the buffer ends up on screen or not the GPU still has to render it. There is no way to have the offscreen rendering not affect the performance... – Justin Meiners May 12 '13 at 16:35
  • I'm not talking about affecting performance, I'm talking about just not showing anything to the user (i.e., a flicker) – user1054922 May 12 '13 at 16:37
  • Non ES versions: http://stackoverflow.com/questions/3191978/how-to-use-glut-opengl-to-render-to-a-file || http://stackoverflow.com/questions/5844858/how-to-take-screenshot-in-opengl – Ciro Santilli OurBigBook.com Mar 26 '16 at 15:37

2 Answers

5

Here's the solution, for anybody who needs it:

// Create framebuffer
unsigned int canvasFrameBuffer;
glGenFramebuffers(1, &canvasFrameBuffer);
glBindFramebuffer(GL_FRAMEBUFFER, canvasFrameBuffer);

// Attach renderbuffer
unsigned int canvasRenderBuffer;
glGenRenderbuffers(1, &canvasRenderBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, canvasRenderBuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA4, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, canvasRenderBuffer);
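
// Optional but useful: verify the framebuffer is complete before drawing;
// an incomplete FBO is a common reason offscreen rendering silently produces nothing.
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    // handle/log the error and bail out here
}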

// Clear the target (optional)
glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);

// Draw whatever you want here

char *buffer = (char*)malloc(width * height * 3);
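// Note: core ES 2.0 only guarantees glReadPixels with GL_RGBA/GL_UNSIGNED_BYTE;
// if the GL_RGB read below returns nothing, query GL_IMPLEMENTATION_COLOR_READ_FORMAT
// and GL_IMPLEMENTATION_COLOR_READ_TYPE for the one extra combination your implementation supports.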
glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, buffer);
SaveTGA("canvas.tga", buffer, width, height); // Your own function to save the image data to a file (in this case, a TGA)
free(buffer);

// unbind frame buffer
glBindRenderbuffer(GL_RENDERBUFFER, 0);
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glDeleteRenderbuffers(1, &canvasRenderBuffer);
glDeleteFramebuffers(1, &canvasFrameBuffer);
user1054922
  • Could you please advise what would you do if width/height was greater than GL_MAX_RENDERBUFFER_SIZE? So that it's not possible to create renderbuffer of desired size... – XZen Jan 19 '15 at 12:18
  • I cannot get it working on linux. What are BGL_RGBA4, ZT_CACHE and Z_MALLOC, SAVE_TGA, Z_FREE? – Adam Hunyadi Nov 27 '16 at 11:47
  • Sorry, these are functions I've built. I've edited my post to replace them with the standard equivalents. – user1054922 Nov 29 '16 at 14:40
1

You can render to a texture, read the pixels back, and then draw a quad with that texture (if you want to show the result to the user). It should not flicker, but it obviously costs some performance.

On iOS, for example, there are related Stack Overflow questions with working code for this.
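
Platform links aside, here is a minimal GL-ES-2.0-only sketch of that path. It assumes an ES 2.0 context is current when it runs; DrawScene() is just a placeholder name for your own draw calls, and the caller frees the returned buffer:

#include <GLES2/gl2.h>
#include <stdlib.h>

extern void DrawScene(void);   // placeholder for your own drawing code

// Renders one frame into a texture-backed FBO and returns the pixels as RGBA bytes.
static unsigned char *RenderOffscreen(int width, int height)
{
    GLint prevFbo = 0;
    glGetIntegerv(GL_FRAMEBUFFER_BINDING, &prevFbo);   // remember the on-screen framebuffer

    // Color texture that will receive the rendering
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    // Framebuffer with the texture as its color attachment
    GLuint fbo = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    unsigned char *pixels = NULL;
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE) {
        glViewport(0, 0, width, height);                // restore your own viewport afterwards
        glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        DrawScene();                                    // your drawing code

        pixels = (unsigned char *)malloc((size_t)width * height * 4);
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }

    // Restore the previous framebuffer so the on-screen rendering is untouched
    glBindFramebuffer(GL_FRAMEBUFFER, (GLuint)prevFbo);
    glDeleteFramebuffers(1, &fbo);
    glDeleteTextures(1, &tex);   // keep the texture instead if you want to draw it as a quad
    return pixels;
}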

Trax
  • Updated post with some code I'm trying to render to an offscreen texture. – user1054922 May 12 '13 at 16:45
  • Updated my answer with some related questions with answers at stackoverflow. – Trax May 12 '13 at 16:45
  • I don't see a call to glBindFramebuffer, is it inside Draw? – Trax May 12 '13 at 16:49
  • Draw() just contains the basic OpenGL drawing code... binding the arrays and glDrawArrays and such. OpenGL ES 2.0 doesn't have glReadBuffer...? I read your links, one of them posts a 'solution', but it looks highly iOS-specific. Looking for something that's GL-only. – user1054922 May 12 '13 at 17:06
  • Yeah my bad, you don't need glReadBuffer, too many versions :) – Trax May 12 '13 at 17:47