
Hi, I'm using the following function to bind a sampleBuffer to an OpenGL texture, which works well.

void *imageData;  

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Convert the incoming sample buffer into a UIImage.
    UIImage *image = [self generateUIImageFromSampleBuffer:sampleBuffer];

    // Lazily allocate the RGBA pixel buffer and bitmap context on the first frame.
    if (imageData == NULL) {
        width = CGImageGetWidth(image.CGImage);
        height = CGImageGetHeight(image.CGImage);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        imageData = malloc(height * width * 4);
        imgcontext = CGBitmapContextCreate(imageData, width, height, 8, 4 * width, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
        CGColorSpaceRelease(colorSpace);
    }

    // Redraw the current frame into the bitmap context so imageData holds its RGBA pixels.
    CGContextClearRect(imgcontext, CGRectMake(0, 0, width, height));
    CGContextTranslateCTM(imgcontext, 0, height - height);
    CGContextDrawImage(imgcontext, CGRectMake(0, 0, width, height), image.CGImage);

    // Upload the pixel data to the bound OpenGL texture.
    glBindTexture(GL_TEXTURE_2D, m_textureId);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);
}

My question: I'm trying to copy imageData into a double array so I can loop through all the pixels and do some processing in a separate process. Here is the code I am using, which is not giving me the results I expect.

double data[width * height * 4];

memcpy(data, imageData, width * height * 4);

for (int y = 0; y < height; y++)
{
    for (int x = 0; x < width; x += 4)
    {
        double r = data[y * width + x];
        double g = data[y * width + x + 1];
        double b = data[y * width + x + 2];
        double a = data[y * width + x + 3];
    }
}

This code is called directly after the texture is bound, but r, g, and b never change value. Any ideas what I may be doing wrong?

Cheers


1 Answer


It looks like you are allocating imageData to store pixel values as 32-bit (4-byte) integers, but then attempting to treat each entry as a double (8 bytes). That's not going to work.
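For example, here is a minimal sketch of looping over the buffer as 8-bit components and widening each one to a double on the fly. It assumes the same width, height, and imageData set up by the CGBitmapContextCreate call above, with a tightly packed RGBA layout (no row padding); the pixels pointer is just an illustrative local name:

// imageData holds 8-bit RGBA components, 4 bytes per pixel.
unsigned char *pixels = (unsigned char *)imageData;

for (int y = 0; y < height; y++)
{
    for (int x = 0; x < width; x++)
    {
        // Each pixel occupies 4 consecutive bytes: R, G, B, A.
        int offset = (y * width + x) * 4;
        double r = pixels[offset];      // component values are in the range 0–255
        double g = pixels[offset + 1];
        double b = pixels[offset + 2];
        double a = pixels[offset + 3];
        // ... process r, g, b, a here ...
    }
}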

The following post may help:

How to get pixel data from a UIImage (Cocoa Touch) or CGImage (Core Graphics)?

And those “4-byte integers” are really four separate numbers, one for each component in the pixel (red, green, blue, and alpha). There's also the issue of `int` and `double` being different formats (integer vs. floating-point), in addition to different sizes. For both of these reasons, you would need to convert each component to a separate `double` value. – Peter Hosey Oct 30 '11 at 06:39
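Building on that comment: if you really do want the whole frame in a double array, memcpy cannot do the conversion for you, since it only copies raw bytes. You would have to convert element by element, along these lines (a rough sketch under the same assumptions about imageData's layout as above):

// One double per component, so the array has width * height * 4 entries.
double *data = malloc(width * height * 4 * sizeof(double));
unsigned char *pixels = (unsigned char *)imageData;

for (int i = 0; i < width * height * 4; i++)
{
    data[i] = (double)pixels[i];  // widen each 8-bit component to a double
}

// ... process data, then release it ...
free(data);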