
I need the same functionality as the application Instant Heart Rate.

The basic process requires the user to:

  1. Place the tip of the index finger gently on the camera lens.
  2. Apply even pressure and cover the entire lens.
  3. Hold it steady for 10 seconds and get the heart rate.

This can be accomplished by turning the flash on and watching the light change as the blood moves through the index finger.

How can I get the light level data from the video capture? Where should I look for this? I looked through the class AVCaptureDevice but didn't find anything useful.

I also found AVCaptureDeviceSubjectAreaDidChangeNotification; would that be useful?

  • But what about non-flash models of iPhone and iPad? – The iOSDev May 18 '12 at 11:58
  • @AalokParikh: if you have enough light in your environment the phone flash is not necessary. – alinoz May 18 '12 at 13:01
  • @alinoz The phone flash *is* necessary for this application of the camera. With the finger against the lens, you would just see blackness otherwise. – occulus Sep 07 '12 at 15:48
  • @occulus, I don't know exactly how it works with the iPhone camera, but with a normal webcam there is no need for extra light if the environment is bright enough. – alinoz Sep 08 '12 at 12:54
  • @TheLion when there is not enough light, lift up your mobile device (while keeping your finger on its camera) and hold it toward a bright light source: for me it worked with daylight through a window. – Developer Marius Žilėnas May 18 '15 at 12:06

3 Answers


Check this out:

    // Create the capture session
    session = [[AVCaptureSession alloc] init];

    // A low preset keeps the frames small
    [session setSessionPreset:AVCaptureSessionPresetLow];

    // Get the default camera device
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Switch on the flash in torch mode so the fingertip is lit from behind
    if ([camera isTorchModeSupported:AVCaptureTorchModeOn]) {
        [camera lockForConfiguration:nil];
        camera.torchMode = AVCaptureTorchModeOn;
        [camera unlockForConfiguration];
    }

    // Create an AVCaptureInput with the camera device
    NSError *error = nil;
    AVCaptureInput *cameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
    if (cameraInput == nil) {
        NSLog(@"Error creating camera capture input: %@", error);
    }

    // Set up the video data output
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];

    // Create a queue to run the capture on
    dispatch_queue_t captureQueue = dispatch_queue_create("captureQueue", NULL);

    // Set up our sample buffer delegate
    [videoOutput setSampleBufferDelegate:self queue:captureQueue];

    // Configure the pixel format
    videoOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA],
                                 (id)kCVPixelBufferPixelFormatTypeKey,
                                 nil];

    // Cap the frame rate at 10 fps (minFrameDuration is deprecated on newer SDKs)
    videoOutput.minFrameDuration = CMTimeMake(1, 10);

    // Add the input and output
    [session addInput:cameraInput];
    [session addOutput:videoOutput];

    // Start the session
    [session startRunning];

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        // This is the image buffer
        CVImageBufferRef cvimgRef = CMSampleBufferGetImageBuffer(sampleBuffer);

        // Lock the image buffer
        CVPixelBufferLockBaseAddress(cvimgRef, 0);

        // Access the dimensions
        size_t width = CVPixelBufferGetWidth(cvimgRef);
        size_t height = CVPixelBufferGetHeight(cvimgRef);

        // Get the raw image bytes
        uint8_t *buf = (uint8_t *)CVPixelBufferGetBaseAddress(cvimgRef);
        size_t bprow = CVPixelBufferGetBytesPerRow(cvimgRef);

        // Get the average red, green and blue values from the image (pixels are BGRA)
        float r = 0, g = 0, b = 0;
        for (size_t y = 0; y < height; y++) {
            for (size_t x = 0; x < width * 4; x += 4) {
                b += buf[x];
                g += buf[x + 1];
                r += buf[x + 2];
            }
            buf += bprow;
        }
        r /= 255 * (float)(width * height);
        g /= 255 * (float)(width * height);
        b /= 255 * (float)(width * height);

        // Unlock the buffer before leaving the callback
        CVPixelBufferUnlockBaseAddress(cvimgRef, 0);

        NSLog(@"%f,%f,%f", r, g, b);
    }

Sample Code Here
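That callback only logs per-frame color averages; to get an actual heart rate you still have to look at how the red value pulses over time. Below is a minimal sketch of that step (not part of the original answer): it buffers the red averages, finds local maxima, and converts the average peak spacing into beats per minute, assuming the roughly 10 fps frame rate configured above. The names pushRedValue: and redSamples are made up for illustration.

    // Assumes a property declared on the class:
    //     @property (nonatomic, strong) NSMutableArray *redSamples;
    // and the ~10 fps frame rate configured in the capture session above.
    static const double kFrameRate = 10.0;

    // Hypothetical helper: call once per frame with the average red value
    // computed in captureOutput:didOutputSampleBuffer:fromConnection:.
    - (void)pushRedValue:(float)red {
        [self.redSamples addObject:@(red)];
        if (self.redSamples.count < (NSUInteger)(kFrameRate * 10)) {
            return; // wait until roughly 10 seconds of samples are collected
        }

        // Find local maxima with a simple three-point peak test
        NSMutableArray *peaks = [NSMutableArray array];
        for (NSUInteger i = 1; i + 1 < self.redSamples.count; i++) {
            float prev = [self.redSamples[i - 1] floatValue];
            float curr = [self.redSamples[i] floatValue];
            float next = [self.redSamples[i + 1] floatValue];
            if (curr > prev && curr > next) {
                [peaks addObject:@(i)];
            }
        }
        if (peaks.count < 2) {
            return;
        }

        // Average spacing between consecutive peaks, in frames
        double totalSpacing = 0;
        for (NSUInteger i = 1; i < peaks.count; i++) {
            totalSpacing += [peaks[i] doubleValue] - [peaks[i - 1] doubleValue];
        }
        double framesPerBeat = totalSpacing / (peaks.count - 1);

        // One beat every framesPerBeat frames -> beats per minute
        double bpm = 60.0 * kFrameRate / framesPerBeat;
        NSLog(@"Estimated heart rate: %.0f bpm", bpm);

        [self.redSamples removeAllObjects];
    }

A real implementation would low-pass filter the red signal before looking for peaks; the raw averages are noisy enough that this naive version can pick up spurious maxima.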


In fact it can be simple: you have to analyze the pixel values of the captured images. One simple algorithm would be: select an area in the center of the image, convert it to gray scale, and take the median pixel value for each frame. You end up with a function of the gray level over time, and on this function you calculate the distance between two minima or two maxima, and the problem is solved (see the sketch below).

If you look at the histogram of the acquired images over a period of 5 seconds, you will notice how the gray-level distribution changes. If you want a more robust calculation, analyze the histogram.
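Here is a minimal sketch of that per-frame step, assuming the same 32BGRA sample buffers used in the accepted answer; the method name medianGrayOfCenterRegion: is made up for illustration. Collect the returned values together with each frame's timestamp, and the spacing between successive maxima gives you the beat period.

    // Hypothetical helper: median gray level of the central quarter of a frame.
    // Assumes a 32BGRA pixel buffer, as configured in the accepted answer.
    - (float)medianGrayOfCenterRegion:(CVImageBufferRef)pixelBuffer {
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        size_t width  = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);
        size_t bprow  = CVPixelBufferGetBytesPerRow(pixelBuffer);
        uint8_t *base = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);

        // Central region of the image
        size_t x0 = width / 4,  x1 = 3 * width / 4;
        size_t y0 = height / 4, y1 = 3 * height / 4;

        NSMutableArray *grays = [NSMutableArray array];
        for (size_t y = y0; y < y1; y++) {
            uint8_t *row = base + y * bprow;
            for (size_t x = x0; x < x1; x++) {
                uint8_t b = row[x * 4], g = row[x * 4 + 1], r = row[x * 4 + 2];
                // Standard luminance weights for the gray-scale conversion
                [grays addObject:@(0.299f * r + 0.587f * g + 0.114f * b)];
            }
        }
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

        // Median of the collected gray values
        [grays sortUsingSelector:@selector(compare:)];
        return [grays[grays.count / 2] floatValue];
    }

The NSNumber array keeps the sketch short; for real use you would accumulate the values in a plain C buffer and pick the median with a selection algorithm instead of a full sort.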

alinoz
    Hi alinoz, would it be possible for you to post some sample code? Thanks in advance! – Brabbeldas Dec 18 '12 at 20:03
  • @AT_AB and Brabbeldas - I will try to put a small setup together in the next few days. – alinoz Jan 21 '13 at 13:47
    @alinoz - did you get a chance to put up a sample project / code for this? Do let us know if you put something together – Sam B May 06 '13 at 03:44
  • @alinoz I didn't get how the problem is solved after we find the distance between two minima or maxima, can you please elaborate on that? – kkk Dec 28 '15 at 08:35
  • @kkk use the timestamp of each captured image. – alinoz Dec 28 '15 at 08:59
  • @alinoz so according to the algorithm I have to calculate each frame's median, then take two maximum (or minimum) median values with their times of occurrence, and from those calculate the heart rate, am I correct? – kkk Dec 28 '15 at 11:02

As a side note, you may be interested in this research paper. This method does not even require a finger (or anything) directly on the lens.

DCS