
My current setup is as follows (based on the ColorTrackingCamera project from Brad Larson):

I'm using an AVCaptureSession set to AVCaptureSessionPreset640x480, whose output I run through an OpenGL scene as a texture. This texture is then manipulated by a fragment shader.

I need this "lower quality" preset to preserve a high frame rate while the user is previewing. I then want to switch to a higher-quality output when the user captures a still photo.

At first I thought I could change the sessionPreset on the AVCaptureSession, but this forces the camera to refocus, which breaks usability:

[captureSession beginConfiguration];
captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
[captureSession commitConfiguration];
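
One mitigation I've sketched but haven't verified: locking the device's focus before the beginConfiguration/commitConfiguration block above, so the preset change doesn't kick off an autofocus pass. Here `device` is assumed to be the session's video AVCaptureDevice:

// Sketch (unverified assumption): lock focus so the preset switch
// doesn't trigger a refocus. `device` is the video AVCaptureDevice.
NSError *error = nil;
if ([device lockForConfiguration:&error])
{
    if ([device isFocusModeSupported:AVCaptureFocusModeLocked])
    {
        device.focusMode = AVCaptureFocusModeLocked;
    }
    [device unlockForConfiguration];
}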

Currently I'm trying to add a second AVCaptureStillImageOutput to the AVCaptureSession, but I'm getting back an empty pixel buffer, so I think I'm stuck.

Here's my session setup code:

...

// Add the video frame output
[captureSession beginConfiguration];

videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoOutput setAlwaysDiscardsLateVideoFrames:YES];
[videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

if ([captureSession canAddOutput:videoOutput])
{
    [captureSession addOutput:videoOutput];
}
else
{
    NSLog(@"Couldn't add video output");
}

[captureSession commitConfiguration];



// Add still output
[captureSession beginConfiguration];
stillOutput = [[AVCaptureStillImageOutput alloc] init];

if([captureSession canAddOutput:stillOutput])
{
    [captureSession addOutput:stillOutput];
}
else
{
    NSLog(@"Couldn't add still output");
}

[captureSession commitConfiguration];



// Start capturing
[captureSession setSessionPreset:AVCaptureSessionPreset640x480];
if(![captureSession isRunning])
{
    [captureSession startRunning];
}

...

And here is my capture method:

- (void)prepareForHighResolutionOutput
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:
     ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
         // CVPixelBufferGetWidth/Height return size_t, so log with %zu
         CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
         CVPixelBufferLockBaseAddress(pixelBuffer, 0);
         size_t width = CVPixelBufferGetWidth(pixelBuffer);
         size_t height = CVPixelBufferGetHeight(pixelBuffer);

         NSLog(@"%zu x %zu", width, height);
         CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
     }];
}

(width and height turn out to be 0)

I've read through the AVFoundation documentation, but it seems I'm missing something essential.

  • How fast do you need this preview to go? With an AVCaptureStillImageOutput with the video preview running (supported on iOS 4.3+), I'm seeing 20 FPS on the input frames from that video preview. It's not the full 30 FPS of the video camera, but it's plenty fine for a preview of a scene you want to photograph. – Brad Larson Aug 21 '12 at 15:31
  • Well, I found a solution a few minutes ago. I'm now using the `AVCaptureStillImageOutput` preset but I had to explicitly set the `outputSettings` to avoid a conversion between colorspaces. I'll post it as an answer right away. – polyclick Aug 21 '12 at 16:06

1 Answer

I found the solution for my specific problem. I hope it can be used as a guide if someone stumbles upon the same problem.

The reason the frame rate dropped significantly had to do with an internal conversion between pixel formats. After setting the pixel format explicitly, the frame rate increased.

In my situation, I was creating a BGRA texture with the following method:

// Let Core Video create the OpenGL texture from pixelbuffer
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, videoTextureCache, pixelBuffer, NULL,
                                                            GL_TEXTURE_2D, GL_RGBA, width, height, GL_BGRA,
                                                            GL_UNSIGNED_BYTE, 0, &videoTexture);
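
For reference, `videoTextureCache` here is a CVOpenGLESTextureCacheRef instance variable created once during setup. A minimal creation sketch, assuming `glContext` is the EAGLContext you render with (that name is mine, not from the project):

// Create the Core Video texture cache once, tied to the EAGL context.
// On older SDKs the context parameter needed a (__bridge void *) cast.
CVReturn cacheErr = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                                 glContext, NULL, &videoTextureCache);
if (cacheErr != kCVReturnSuccess)
{
    NSLog(@"Error creating texture cache: %d", cacheErr);
}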

So when I set up the AVCaptureStillImageOutput instance, I changed my code to:

// Add still output
stillOutput = [[AVCaptureStillImageOutput alloc] init];
[stillOutput setOutputSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];

if([captureSession canAddOutput:stillOutput])
{
    [captureSession addOutput:stillOutput];
}
else
{
    NSLog(@"Couldn't add still output");
}
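
To tie it together, here's a rough sketch (not my exact code; `connectionWithMediaType:` is just a shortcut for the connection loop from the question) of capturing a full-resolution BGRA still and pushing it through the same texture cache as the preview frames:

AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];

[stillOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:
 ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
     if (error != nil || imageSampleBuffer == NULL)
     {
         NSLog(@"Still capture failed: %@", error);
         return;
     }

     // With the BGRA outputSettings above, this buffer matches the
     // GL_BGRA / GL_UNSIGNED_BYTE upload path used for preview frames.
     CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
     size_t width = CVPixelBufferGetWidth(pixelBuffer);
     size_t height = CVPixelBufferGetHeight(pixelBuffer);

     // The texture cache call assumes the EAGL context is current here.
     CVOpenGLESTextureRef stillTexture = NULL;
     CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, videoTextureCache,
                                                                 pixelBuffer, NULL,
                                                                 GL_TEXTURE_2D, GL_RGBA,
                                                                 (GLsizei)width, (GLsizei)height,
                                                                 GL_BGRA, GL_UNSIGNED_BYTE,
                                                                 0, &stillTexture);
     if (err != kCVReturnSuccess)
     {
         NSLog(@"Texture creation failed: %d", err);
         return;
     }

     // Bind with CVOpenGLESTextureGetName(stillTexture), draw, then release.
     CFRelease(stillTexture);
 }];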

I hope this helps someone someday ;)

  • I am trying your code but I am still clicking at 640 by 480. Any update possibly? – p0lAris Jul 15 '14 at 11:24
  • I have a similar situation in my code, where I'm taking the 640x480 feed but then want to take a hi-res photo. Unfortunately, even with the added `setOutputSettings:` parameter, I still get a width and height of zero. Can you help? – jowie Jan 16 '15 at 13:45
  • Please tell me, what do you do to get high quality? Your preset is `AVCaptureSessionPreset640x480`... – vkalit Nov 22 '15 at 23:39
  • Can you please provide the final code? I am not getting where to use CVOpenGLESTextureCacheCreateTextureFromImage. – Jaydeep Patel Jan 30 '17 at 07:10