
I am working with OpenCV to detect a face, and I want the face to be cropped once it is detected. So far I have detected the face and drawn a rect/ellipse around it on the iPhone.

Please help me crop the face in a circular/elliptical pattern.

- (UIImage *)opencvFaceDetect:(UIImage *)originalImage 
{

cvSetErrMode(CV_ErrModeParent);

IplImage *image = [self CreateIplImageFromUIImage:originalImage];

// Scaling down

/*
Creates IPL image (header and data) ----------------cvCreateImage
CVAPI(IplImage*)  cvCreateImage( CvSize size, int depth, int channels );
*/

IplImage *small_image = cvCreateImage(cvSize(image->width/2,image->height/2),
    IPL_DEPTH_8U, 3);

/* Smooths and downsamples the image with a Gaussian pyramid --------- cvPyrDown */
cvPyrDown(image, small_image, CV_GAUSSIAN_5x5);
int scale = 2;

// Load XML
NSString *path = [[NSBundle mainBundle] pathForResource:@"haarcascade_frontalface_default" ofType:@"xml"];
CvHaarClassifierCascade* cascade = (CvHaarClassifierCascade*)cvLoad([path cStringUsingEncoding:NSASCIIStringEncoding], NULL, NULL, NULL);

// Check whether the cascade has loaded successfully. Otherwise report an error and quit.

if( !cascade )
{
    NSLog(@"ERROR: Could not load classifier cascade\n");
    cvReleaseImage(&small_image);
    cvReleaseImage(&image);
    return nil;
}

//Allocate the Memory storage
CvMemStorage* storage = cvCreateMemStorage(0);

// Clear the memory storage which was used before
cvClearMemStorage( storage );

CGColorSpaceRef colorSpace;
CGContextRef contextRef;


CGRect face_rect = CGRectZero;  // stays zero if no face is found
// Find whether the cascade is loaded, to find the faces. If yes, then:
if( cascade )
{
CvSeq* faces = cvHaarDetectObjects(small_image, cascade, storage, 1.1f, 3, 0, cvSize(20, 20));
cvReleaseImage(&small_image);

// Create canvas to show the results
 CGImageRef imageRef = originalImage.CGImage;
 colorSpace = CGColorSpaceCreateDeviceRGB();
 contextRef = CGBitmapContextCreate(NULL, originalImage.size.width, originalImage.size.height, 8, originalImage.size.width * 4,
                                                colorSpace, kCGImageAlphaPremultipliedLast|kCGBitmapByteOrderDefault);
//VIKAS
CGContextDrawImage(contextRef, CGRectMake(0, 0, originalImage.size.width, originalImage.size.height), imageRef);



CGContextSetLineWidth(contextRef, 4);
CGContextSetRGBStrokeColor(contextRef, 1.0, 1.0, 1.0, 0.5);




// Draw results on the image: draw all components of the face as small rectangles

// Loop the number of faces found.

for(int i = 0; i < faces->total; i++) 
    {
    NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];

    // Calc the rect of faces
    // Create a new rectangle for drawing the face

    CvRect cvrect = *(CvRect*)cvGetSeqElem(faces, i);
    //  CGRect face_rect = CGContextConvertRectToDeviceSpace(contextRef, 
    //                          CGRectMake(cvrect.x * scale, cvrect.y * scale, cvrect.width * scale, cvrect.height * scale));


     // Scale the detected rect back up to the full-size image
     // (the extra 1.25 on the height stretches the box a little below the chin)
     face_rect = CGContextConvertRectToDeviceSpace(contextRef,
                     CGRectMake(cvrect.x * scale, cvrect.y * scale,
                                cvrect.width * scale, cvrect.height * scale * 1.25));

    facedetectapp=(FaceDetectAppDelegate *)[[UIApplication sharedApplication]delegate];
    facedetectapp.grabcropcoordrect=face_rect;

    NSLog(@"  FACE off %f %f %f %f",facedetectapp.grabcropcoordrect.origin.x,facedetectapp.grabcropcoordrect.origin.y,facedetectapp.grabcropcoordrect.size.width,facedetectapp.grabcropcoordrect.size.height);
    CGContextStrokeRect(contextRef, face_rect);
        //CGContextFillEllipseInRect(contextRef,face_rect);
    CGContextStrokeEllipseInRect(contextRef,face_rect);


    [pool release];
}

}
// Crop the original image to the (rectangular) bounding box of the last detected face
CGImageRef imageRef = CGImageCreateWithImageInRect([originalImage CGImage], face_rect);
UIImage *returnImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);


CGContextRelease(contextRef);
CGColorSpaceRelease(colorSpace);

cvReleaseImage(&image);
cvReleaseMemStorage(&storage);
cvReleaseHaarClassifierCascade(&cascade);

   return returnImage;
}



Thanks Vikas

vikas ojha
    You left capslock on for the title and you forgot to format your code. Please correct this if you want people to read your question. – PengOne Jun 21 '11 at 23:12
  • I've never done this but I read about it so i give you just a comment. You could draw the eclipse to a mask image and then use CGImageCreateWithMask. Take a look at this example: http://iphonedevelopertips.com/cocoa/how-to-mask-an-image.html – n3on Jun 21 '11 at 23:55
  • @n3on:: I tried using iphonedevelopertips.com/cocoa/how-to-mask-an-image.html, but it just puts a mask over the image and doesn't crop it – vikas ojha Jun 22 '11 at 17:02
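Regarding the masking suggestion in the comments: one way to get an actual elliptical crop (rather than just an overlaid mask) is to clip a bitmap context to an ellipse before drawing the image into it. A minimal sketch, assuming faceRect is the face rectangle already computed in opencvFaceDetect: above (the method name here is just illustrative):

// Sketch only: crops 'image' to an ellipse inscribed in 'faceRect'
// (faceRect is assumed to be in the image's coordinate space).
- (UIImage *)croppedEllipticalFaceFromImage:(UIImage *)image inRect:(CGRect)faceRect
{
    UIGraphicsBeginImageContext(faceRect.size);
    CGContextRef ctx = UIGraphicsGetCurrentContext();

    // Clip to an ellipse filling the target rect; pixels outside it stay
    // transparent, so the result is genuinely cropped rather than masked over.
    CGContextAddEllipseInRect(ctx, CGRectMake(0, 0, faceRect.size.width, faceRect.size.height));
    CGContextClip(ctx);

    // Draw the source image shifted so that faceRect lands at the origin.
    [image drawAtPoint:CGPointMake(-faceRect.origin.x, -faceRect.origin.y)];

    UIImage *cropped = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return cropped;
}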

2 Answers


Following is the answer I gave in How to crop UIImage on oval shape or circle shape? to make the image circular. It works for me.

Download the support archive from http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/ and add the following categories to your project:

#import "UIImage+RoundedCorner.h"
#import "UIImage+Resize.h"

The following lines resize the image and round it off with a corner radius:

UIImage *mask = [UIImage imageNamed:@"mask.jpg"];

mask = [mask resizedImage:CGSizeMake(47, 47) interpolationQuality:kCGInterpolationHigh ];
mask = [mask roundedCornerImage:23.5 borderSize:1];
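Applied to the question's code, the same two categories could be used on the cropped face instead of mask.jpg. A hypothetical usage, assuming opencvFaceDetect: returns the rectangular face crop and with the sizes chosen only for illustration:

UIImage *face = [self opencvFaceDetect:originalImage];
// Resize to a square, then round with a radius of half the side to get a circle
face = [face resizedImage:CGSizeMake(100, 100) interpolationQuality:kCGInterpolationHigh];
face = [face roundedCornerImage:50 borderSize:0];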

Hope it helps someone.

Dilip Rajkumar

There are a pile of blend modes to choose from, a few of which are useful for "masking". I believe this should do approximately what you want:

CGContextSaveGState(contextRef);
CGContextSetBlendMode(contextRef,kCGBlendModeDestinationIn);
CGContextFillEllipseInRect(contextRef,face_rect);
CGContextRestoreGState(contextRef);

"approximately" because it'll mask the entire context contents every time, thus doing the wrong thing for more than one face. To handle this case, use CGContextAddEllipseInRect() in the loop and CGContextFillPath() at the end.

You might also want to look at CGContextBeginTransparencyLayerWithRect().
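A sketch of the multi-face variant mentioned above (CGContextAddEllipseInRect() in the loop, CGContextFillPath() at the end), assuming contextRef already has the photo drawn into it and faces/scale come from the detection loop in the question:

CGContextSaveGState(contextRef);
CGContextSetBlendMode(contextRef, kCGBlendModeDestinationIn);

// Accumulate one ellipse per detected face ...
for (int i = 0; i < faces->total; i++) {
    CvRect r = *(CvRect *)cvGetSeqElem(faces, i);
    CGContextAddEllipseInRect(contextRef,
        CGRectMake(r.x * scale, r.y * scale, r.width * scale, r.height * scale));
}

// ... then punch out everything outside them in a single fill
CGContextFillPath(contextRef);
CGContextRestoreGState(contextRef);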

tc.