
I am trying to integrate the glGrab code for screen capture on Mac OS under the mentioned config, and I am currently stuck at an all-blue screen being rendered inside my window. I believe there is some issue with how the image texture is created, but I can't tell what. I am only a couple of weeks into OpenGL, so please go easy on me if I missed something obvious.

I am using the glGrab code as-is, except for the CGLSetFullScreen call (and not even CGLSetFullScreenOnDisplay), because these methods are now deprecated, so that one line of code has been commented out for the time being.

I have been researching this topic for some time now and found another thread on Stack Overflow which could possibly have been the complete answer, and it helped a great deal nonetheless: Convert UIImage to CVImageBufferRef

A direct reference to the glGrab code is http://code.google.com/p/captureme/source/browse/trunk/glGrab.c


1 Answer


The answer to my question above is below, so no more OpenGL or glGrab; use what's best optimized for Mac OS X. This doesn't include the code for capturing the mouse pointer, but if you have landed on this page I am sure you're smart enough to figure that out by yourself. Or, if someone reading this knows the solution, here's your chance to help the fraternity :) Also note that this code returns a CVPixelBufferRef; you may choose to send back the CGImageRef, or even the byte stream as-is, so tweak it to your liking:

void swizzleBitmap(void *data, int rowBytes, int height) {
    // Flip the bitmap vertically in place by swapping rows from the
    // top and bottom ends until they meet in the middle.
    int top = 0;
    int bottom = height - 1;
    void *base = data;
    void *buffer = malloc(rowBytes);    // scratch space for one row

    while (top < bottom) {
        void *topP = (void *)((top * rowBytes) + (intptr_t)base);
        void *bottomP = (void *)((bottom * rowBytes) + (intptr_t)base);

        bcopy(topP, buffer, rowBytes);
        bcopy(bottomP, topP, rowBytes);
        bcopy(buffer, bottomP, rowBytes);

        ++top;
        --bottom;
    }
    free(buffer);
}

CVImageBufferRef grabViaOpenGL() {
    CGImageRef image = CGDisplayCreateImage(kCGDirectMainDisplay);    // Main screenshot capture call

    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));    // Get screenshot bounds

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                            [NSNumber numberWithBool:NO], kCVPixelBufferCGImageCompatibilityKey,
                            [NSNumber numberWithBool:NO], kCVPixelBufferCGBitmapContextCompatibilityKey,
                            nil];

    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
                                          frameSize.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options,
                                          &pxbuffer);
    if (status != kCVReturnSuccess || pxbuffer == NULL) {    // Don't write through a NULL buffer
        CGImageRelease(image);
        return NULL;
    }

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pxbuffer);    // CoreVideo may pad rows; use the real stride rather than assuming width * 4

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width,
                                                 frameSize.height, 8, bytesPerRow, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipLast);

    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);

    swizzleBitmap(pxdata, (int)bytesPerRow, (int)frameSize.height);    // Flip the image vertically in place

    CGColorSpaceRelease(rgbColorSpace);
    CGImageRelease(image);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;    // Caller owns the buffer and must CVPixelBufferRelease() it
}
CapRend