
Okay, so I have this project that I am working on. The task is to measure heart rate using the iPhone/iPad camera. I am trying to capture video using AVFoundation, get each frame, sum the red component of every pixel in the frame, and divide by the number of pixels to get the average red value for that frame.

I first set up the video capture -

-(void) setupAVCapture{
    _session = [[AVCaptureSession alloc] init];
    _session.sessionPreset = AVCaptureSessionPresetMedium;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!error) {
        // Turn on the torch so the fingertip covering the lens is illuminated.
        if ([device lockForConfiguration:&error]) {
            if ([device hasTorch] && [device isTorchModeSupported:AVCaptureTorchModeOn]) {
                [device setTorchMode:AVCaptureTorchModeOn];
            }
            [device unlockForConfiguration];
        }
        if ( [_session canAddInput:input] )
            [_session addInput:input];

        // Deliver BGRA frames to the sample buffer delegate on a background serial queue.
        AVCaptureVideoDataOutput *videoDataOutput = [AVCaptureVideoDataOutput new];
        [videoDataOutput setVideoSettings:@{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) }];
        [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
        dispatch_queue_t videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];

        if ( [_session canAddOutput:videoDataOutput] )
            [_session addOutput:videoDataOutput];

        [_session startRunning];
    }
    else{
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:[NSString stringWithFormat:@"Failed with error %d", (int)[error code]] message:[error localizedDescription] delegate:nil cancelButtonTitle:@"Dismiss" otherButtonTitles:nil];
        [alertView show];
    }
}

And then use the delegate method as follows -

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
     // got an image
     CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
     //redScores property is an array which stores the red values of all the frames
     [self.redScores addObject: [NSNumber numberWithFloat:[self processPixelBuffer:pixelBuffer]]];
 }


-(float) processPixelBuffer:(CVPixelBufferRef) pixelBuffer{
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    size_t bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
    size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer);
    unsigned char *pixels = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
    int meanRedPixelWeight=0.0;
    for (int i = 0; i < (bufferWidth * bufferHeight); i++) {
        meanRedPixelWeight += pixels[2];
    }
    meanRedPixelWeight=meanRedPixelWeight/(bufferWidth*bufferHeight);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    NSLog(@"%d",meanRedPixelWeight);
    return meanRedPixelWeight;
}

But this doesn't seem to give me the correct red values. For one, I see the values constantly decreasing, when they should be going up and down. Secondly, I took the video and processed it in MATLAB by doing something like -

v = VideoReader('filepath');
noOfFrames = v.NumberOfFrames;
x = zeros(1, noOfFrames);
for i = 1:noOfFrames
   frame = read(v, i);
   redPlane = frame(:, :, 1);
   x(i) = sum(sum(redPlane)) / (size(frame, 1) * size(frame, 2));
end

I get very different average values. The MATLAB ones are close to 255, and I can tell they are correct because all the frames are almost fully red.

Any ideas on what is wrong with the Objective-C code?

  • May I make a suggestion? You can avoid having to write any of this code if you use my framework: https://github.com/BradLarson/GPUImage and feed the output from a GPUImageVideoCamera into a GPUImageAverageColor operation. That will give you a callback block which will hand you the average red, green, and blue values for each video frame as they come in. It's also GPU-accelerated, so it's pretty fast. – Brad Larson Mar 25 '14 at 22:06 (a sketch of this pipeline appears after the comment thread)
  • You're way deeper than me on CV, so I won't be much help, but (a) why do you de-ref pixels[2]? Is alpha first? And (b) I've used Brad Larson's GPUImage and it's a big help. Thanks @BradLarson! – danh Mar 25 '14 at 22:08
  • Thank you very much @BradLarson. I implemented it using GPUImage and it seems to work fine, although I noticed one issue: the average red value always seems to go down. It starts off at 254 and keeps going down in steps of ~0.01. I stopped after it reached 230. – user3461705 Mar 26 '14 at 01:00
  • @danh - I think alpha is pixels[3]. – user3461705 Mar 26 '14 at 01:00
  • @user3461705 - The iOS camera will always attempt to correct white balance and luminance, so I don't think you can rely on red values by themselves. Others have used red / blue ratios or crossings, or converted to an HSV colorspace to remove the effect of luminance changes: http://stackoverflow.com/questions/19773631/ios-heart-rate-detection-algorithm – Brad Larson Mar 26 '14 at 04:15
  • Thank you @BradLarson. I will try doing that! One other thing: I've been trying to find out how to get the frame rate of the recording. I see that there is a framerate property on the GPUImageVideoCamera object, but that is always 0. Any ideas? – user3461705 Mar 28 '14 at 18:12
  • Framerate is variable (the iOS camera will reduce framerate in lower illuminations), so you'd probably be best served by looking at the frame times for each frame as it comes in. You can get the between-frame times as a measure of instantaneous framerate. – Brad Larson Mar 28 '14 at 18:15
  • I am moving over a 6-second window to calculate the heart rate, so I am averaging the framerate over that window. Thank you @BradLarson! – user3461705 Mar 29 '14 at 01:27
  • @BradLarson, I am facing an issue when I try to update UI elements, like a label, from the block passed to setColorAverageProcessingFinishedBlock. It doesn't update the label. Is it something to do with GPU-CPU communication? – user3461705 Mar 29 '14 at 22:57
  • I have something like - __weak typeof(self) weakSelf = self; [averageColor setColorAverageProcessingFinishedBlock:^(CGFloat redComponent, CGFloat greenComponent, CGFloat blueComponent, CGFloat alphaComponent, CMTime frameTime){ weakSelf.promptLabel.text = @"Detecting.."; }]; – user3461705 Mar 29 '14 at 23:00
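For reference, here is a minimal sketch of the pipeline Brad Larson describes in the comments: a GPUImageVideoCamera feeding a GPUImageAverageColor, with the between-frame times used as an instantaneous framerate and UI work dispatched back to the main queue. This is only a sketch under a few assumptions - a videoCamera property to keep the camera alive, the promptLabel from the comment above, and GPUImageVideoCamera exposing the underlying AVCaptureDevice as inputCamera - so check the exact names against the GPUImage headers.

-(void) setupGPUImageCapture{
    // Back camera at a modest preset; GPUImage manages the AVFoundation session internally.
    self.videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    self.videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

    // Turn on the torch so the fingertip covering the lens is illuminated.
    AVCaptureDevice *device = self.videoCamera.inputCamera;
    if ([device hasTorch] && [device isTorchModeSupported:AVCaptureTorchModeOn] && [device lockForConfiguration:nil]) {
        [device setTorchMode:AVCaptureTorchModeOn];
        [device unlockForConfiguration];
    }

    GPUImageAverageColor *averageColor = [[GPUImageAverageColor alloc] init];

    __weak typeof(self) weakSelf = self;
    __block CMTime lastFrameTime = kCMTimeInvalid;
    [averageColor setColorAverageProcessingFinishedBlock:^(CGFloat redComponent, CGFloat greenComponent, CGFloat blueComponent, CGFloat alphaComponent, CMTime frameTime){
        // Instantaneous framerate from the gap between consecutive frame times.
        if (CMTIME_IS_VALID(lastFrameTime)) {
            Float64 delta = CMTimeGetSeconds(CMTimeSubtract(frameTime, lastFrameTime));
            if (delta > 0) {
                NSLog(@"red %f, ~%.1f fps", redComponent, 1.0 / delta);
            }
        }
        lastFrameTime = frameTime;

        // The block is called off the main thread, so hop to the main queue before touching UIKit.
        dispatch_async(dispatch_get_main_queue(), ^{
            weakSelf.promptLabel.text = @"Detecting..";
        });
    }];

    [self.videoCamera addTarget:averageColor];
    [self.videoCamera startCameraCapture];
}

The same block is also a natural place to push redComponent into a rolling buffer for the 6-second window mentioned above.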
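On the white balance and luminance point Brad Larson raises above, one of the suggested approaches is to convert the averaged color to HSV so that luminance changes largely drop out. A small illustrative helper (the function name is mine, not part of GPUImage) using UIColor's built-in conversion:

static CGFloat hueFromAverageColor(CGFloat red, CGFloat green, CGFloat blue){
    // Convert the averaged RGB sample to HSV and keep only the hue (0..1),
    // so global luminance and white-balance shifts have less effect on the signal.
    CGFloat hue = 0.0, saturation = 0.0, brightness = 0.0, alpha = 0.0;
    UIColor *color = [UIColor colorWithRed:red green:green blue:blue alpha:1.0];
    [color getHue:&hue saturation:&saturation brightness:&brightness alpha:&alpha];
    return hue;
}

Tracking that hue (or a red/blue ratio) over time should give a steadier pulse signal than the raw red average, which the auto white balance keeps dragging down.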

1 Answer


I appreciate that you've found another solution but, for reference, something like this would have worked, I believe. The main problems in your version are that the inner loop reads pixels[2] on every pass without ever advancing through the buffer, and that it ignores the 4-bytes-per-pixel BGRA layout and the row padding reported by CVPixelBufferGetBytesPerRow:

- (float) processPixelBuffer:(CVPixelBufferRef) pixelBuffer{
  CVPixelBufferLockBaseAddress(pixelBuffer, 0);
  size_t pixelWidth = CVPixelBufferGetWidth(pixelBuffer);
  size_t pixelHeight = CVPixelBufferGetHeight(pixelBuffer);
  size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
  uint8_t *sourceBuffer = (uint8_t*)CVPixelBufferGetBaseAddress(pixelBuffer);

  // Copy the pixel data out, then release the lock on the buffer.
  size_t bufferSize = bytesPerRow * pixelHeight;
  uint8_t *pixels = malloc(bufferSize);
  memcpy(pixels, sourceBuffer, bufferSize);
  CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

  // BGRA: 4 bytes per pixel, red at byte offset 2; rows are bytesPerRow bytes apart.
  long totalRedPixelWeight = 0;
  for (int i = 0; i < pixelHeight; i++) {
    for (int ii = 0; ii < pixelWidth * 4; ii += 4) {
      size_t redLoc = (bytesPerRow * i) + ii + 2;
      totalRedPixelWeight += pixels[redLoc];
    }
  }
  free(pixels);

  float meanRedPixelWeight = (float)totalRedPixelWeight / (float)(pixelWidth * pixelHeight);
  NSLog(@"%f", meanRedPixelWeight);
  return meanRedPixelWeight;
}
Wildaker