
I am working through some existing code for a project I am assigned to.

I have a successful call to glTexImage2D like this:
glTexImage2D(GL_TEXTURE_2D, 0, texture->format, texture->widthTexture, texture->heightTexture, 0, texture->format, texture->type, texture->data);

I would like to create an image (preferably a CGImage or UIImage) from the variables passed to glTexImage2D, but I don't know if it's possible.

I need to create many sequential images (many per second) from an OpenGL view and save them for later use.

Should I be able to create a CGImage or UIImage using the variables I pass to glTexImage2D?

If so, how should I do it?

If not, why not, and what do you suggest for my task of saving/capturing the contents of my OpenGL view many times per second?

Edit: I have already successfully captured images using some techniques provided by Apple with glReadPixels, etc. I want something faster so I can get more images per second.

Edit: After reviewing and adding the code from Thomson, here is the resulting image: [resulting image from Thomson's code]

The image very slightly resembles what it should look like, except it is duplicated ~5 times horizontally and has some random black space underneath.

Note: the video data (each frame) is coming over an ad-hoc network connection to the iPhone. I believe the camera is capturing each frame in the YCbCr color space.

Edit: After further reviewing Thomson's code, I copied the new code into my project and got a different image as a result:

image 2

width: 320 height: 240

I am not sure how to find the number of bytes in texture->data; it is a void pointer.

Edit: format and type

texture.type = GL_UNSIGNED_SHORT_5_6_5

texture.format = GL_RGB
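
With that format/type combination, the number of bytes in texture->data follows directly from the dimensions. A quick sketch of the arithmetic (assuming widthTexture/heightTexture are the 320x240 dimensions mentioned above):

// GL_RGB + GL_UNSIGNED_SHORT_5_6_5 packs a whole RGB pixel into one 16-bit value,
// so the buffer behind the void* is simply width * height * 2 bytes.
size_t bytesPerPixel = sizeof(u_int16_t);   // 2 bytes per 5-6-5 pixel
size_t length = texture->widthTexture * texture->heightTexture * bytesPerPixel;
// e.g. 320 * 240 * 2 = 153600 bytes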

james
  • The number of bytes is: int length = width * height * channels; This is only for 8 bits per color. Find out what texture->format and texture->type are. If you grab the hex value and search opengl.org's docs you should be able to find out exactly what your data is: how many channels (RGB or RGBA) and whether you are sending unsigned bytes (very likely). – v01d Jan 16 '11 at 02:05
  • Hey binny, that GL_UNSIGNED_SHORT_5_6_5 is pretty critical information that's good to know. When I have the time I'll post here with how to unpack it into a format usable to the UIImage code I provided below. – Thomson Comer Jan 22 '11 at 19:25

3 Answers


Hey binnyb, here's the solution to creating a UIImage using the data stored in texture->data. v01d is certainly right that you're not going to get the UIImage as it appears in your GL framebuffer, but it'll get you an image from the data before it has passed through the framebuffer.

Turns out your texture data is in 16 bit format, 5 bits for red, 6 bits for green, and 5 bits for blue. I've added code for converting the 16 bit RGB values into 32 bit RGBA values before creating a UIImage. I'm looking forward to hearing how this turns out.

float width    = 512;
float height   = 512;
int   channels = 4;

// create a buffer for our image after converting it from 565 rgb to 8888 rgba
u_int8_t* rawData = (u_int8_t*)malloc(width*height*channels);

// texture->data is a void*, so grab a typed pointer we can index byte-by-byte
u_int8_t* src = (u_int8_t*)texture->data;

// unpack the 5,6,5 pixel data into 32 bit RGBA
for (int i=0; i<width*height; ++i) 
{
    // append two adjacent bytes in texture->data into a 16 bit int
    u_int16_t pixel16 = (src[i*2] << 8) + src[i*2+1];
    // mask and shift each component into a single 8 bit unsigned, then normalize
    // from the 5/6 bit max to the 8 bit integer max.  Alpha set to 0 (skipped below).
    rawData[channels*i]   = ((pixel16 & 0xF800) >> 11) / 31.0 * 255;  // red:   top 5 bits
    rawData[channels*i+1] = ((pixel16 & 0x07E0) >> 5)  / 63.0 * 255;  // green: middle 6 bits
    rawData[channels*i+2] =  (pixel16 & 0x001F)        / 31.0 * 255;  // blue:  bottom 5 bits
    rawData[channels*i+3] = 0;                                        // alpha padding byte
}

// same as before
int                    bitsPerComponent = 8;
int                    bitsPerPixel     = channels*bitsPerComponent;
int                    bytesPerRow      = channels*width;
CGColorSpaceRef        colorSpaceRef    = CGColorSpaceCreateDeviceRGB();
CGBitmapInfo           bitmapInfo       = kCGImageAlphaNoneSkipLast | kCGBitmapByteOrderDefault;
CGColorRenderingIntent renderingIntent  = kCGRenderingIntentDefault;

CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, 
                                                          rawData, 
                                                          channels*width*height,
                                                          NULL);
// note: don't free(rawData) here -- the provider references the buffer directly,
// so it must stay valid for the lifetime of the image (see the release-callback
// sketch below for one way to hand ownership to the provider)
CGImageRef        imageRef = CGImageCreate(width,
                                           height,
                                           bitsPerComponent,
                                           bitsPerPixel,
                                           bytesPerRow,
                                           colorSpaceRef,
                                           bitmapInfo,
                                           provider,NULL,NO,renderingIntent);

UIImage *newImage = [UIImage imageWithCGImage:imageRef];

The code for creating a new image comes from Creating UIImage from raw RGBA data thanks to Rohit. I've tested this with our original 320x240 image dimension, having converted a 24 bit RGB image into 5,6,5 format and then up to 32 bit. I haven't tested it on a 512x512 image but I don't expect any problems.
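
One lifetime detail worth flagging about the code above: CGDataProviderCreateWithData does not copy rawData, so the buffer has to stay valid for as long as the CGImage/UIImage is in use. A minimal sketch of handing ownership of the buffer to the provider via a release callback (the callback name is just illustrative):

// called by CoreGraphics when the data provider is destroyed; frees the pixel buffer
static void releasePixelBuffer(void *info, const void *data, size_t size)
{
    free((void *)data);
}

// ...

CGDataProviderRef provider = CGDataProviderCreateWithData(NULL,
                                                          rawData,
                                                          channels*width*height,
                                                          releasePixelBuffer);
CGImageRef imageRef = CGImageCreate(width, height,
                                    bitsPerComponent, bitsPerPixel, bytesPerRow,
                                    colorSpaceRef, bitmapInfo,
                                    provider, NULL, NO, renderingIntent);
UIImage *newImage = [UIImage imageWithCGImage:imageRef];

// the UIImage retains the CGImage, which retains the provider, so the local
// references can be dropped; rawData is freed when the image goes away
CGImageRelease(imageRef);
CGDataProviderRelease(provider);
CGColorSpaceRelease(colorSpaceRef);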

Thomson Comer
  • OK, I've reviewed and tested this code. I have added an image and some other notes to my question. – james Jan 14 '11 at 18:29
  • Hey binnyb, can you list what texture->widthTexture and texture->heightTexture are, and how many bytes are stored in texture->data? Right now the error you are experiencing is because the size of the allocated UIImage doesn't align with the size of the image in texture->data. It looks like your data doesn't have an alpha component, hence the misalignment. Figuring out the perfect pixel configuration like this can be difficult, especially without me having the actual data. It is best for you to experiment. If each image pixel is in YCbCr format, that is another problem we can deal with later. – Thomson Comer Jan 14 '11 at 21:13
  • The source code above has been modified to support raw RGB data instead of RGBA data. I tested my code with RGB data when channels was set to 4 and got very similar image artifacting to what you're experiencing now. This ought to do it. – Thomson Comer Jan 14 '11 at 21:25
  • Thanks! Updated my question with another image and width/height. – james Jan 14 '11 at 22:00
  • Hey binnyb, did you notice I updated my answer to handle your 16 bit texture? – Thomson Comer Feb 03 '11 at 16:15
  • The line `u_int16_t pixel16 = (texture.data[i*2] << 8) + texture.data[i*2+1];` is giving me problems: Invalid operands to binary expression ('void' and 'int'). I am not familiar with this type of error! – james Nov 18 '11 at 19:53
  • Wow hey binnyb long time no see! – Thomson Comer Nov 18 '11 at 22:17
  • Hiya, I'm back to give this thing another try after a period of nothingness. – james Nov 18 '11 at 22:28

You could make an image from the data you are sending to GL, but I doubt that's really what you want to achieve.

My guess is you want the output of the framebuffer. To do that you need glReadPixels(). Bear in mind that for a large buffer (say 1024x768) it will take seconds to read the pixels back from GL; you won't get more than 1 per second.
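
For reference, the readback described here is just a glReadPixels call against the currently bound framebuffer. A minimal sketch (width/height are assumed to be the framebuffer dimensions; buffer management and any Retina scale factor are left to the caller):

// read the currently bound framebuffer into client memory;
// GL_RGBA / GL_UNSIGNED_BYTE is the combination OpenGL ES guarantees to support
GLubyte *pixels = (GLubyte *)malloc(width * height * 4);
glPixelStorei(GL_PACK_ALIGNMENT, 4);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
// wrap `pixels` in a CGDataProvider/CGImage the same way as in Thomson's answer, then free it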

v01d
  • Indeed, I should have mentioned I have already successfully captured images via `glReadPixels`, but it is too slow and I am looking for an alternative. – james Jan 10 '11 at 22:40
  • You won't get anything faster on iOS at this point. glReadPixels is the only option to pull back data from GL. – v01d Jan 11 '11 at 01:16

You should be able to use the UIImage initializer imageWithData for this. All you need is to ensure that the data in texture->data is in a structured format that is recognizable to the UIImage constructor.

NSData* imageData = [NSData dataWithBytes:texture->data length:(3*texture->widthTexture*texture->heightTexture)];
UIImage* theImage = [UIImage imageWithData:imageData];

The types that imageWithData: supports are not well documented, but you can create NSData from .png, .jpg, .gif, and I presume .ppm files without any difficulty. If texture->data is in one of those binary formats I suspect you can get this running with a little experimentation.

Thomson Comer
  • texture->data is raw pixel data. imageWithData: will be unsuccessful on raw data because it won't know the image packing format or size. You could make a CGImage from the raw data and then a UIImage from that. But I suspect @binnyb wants updated data from a GL FBO. You won't get it from the texture data, as it's copied into an internal buffer inside GL that you don't have access to. See http://www.opengl.org/sdk/docs/man/xhtml/glTexImage2D.xml – v01d Jan 11 '11 at 01:20
  • OK, you can use imageWithCGImage: instead. You'll have to customize the code to reflect your image data, but a previous StackOverflow question addresses this problem: http://stackoverflow.com/questions/4545237/creating-uiimage-from-raw-rgba-data You can write the raw image data to a CGImageRef and create your UIImage from that. – Thomson Comer Jan 11 '11 at 18:58
  • Any advice on "you'll have to customize the code to reflect your image data"? – james Jan 14 '11 at 14:39