1

I want to develop an iPhone app that recognises the eyes, face, and skin color of a person in an image captured after scanning with a QR reader.

How can eyes be detected in an image?

jscs
  • 63,694
  • 13
  • 151
  • 195
Harika
  • 11
  • 1
  • I've tried to make your tags a little more relevant to what you're asking. – Luke Aug 17 '11 at 19:29
  • @Harika: this type of computer-vision problem cannot be tackled in a hurry... People have done entire research careers on the topic :) – Amro Aug 17 '11 at 22:31
  • 1
    Urgent? Define urgent? I know a group that has worked on almost this exact problem for 6 years. They are starting to get close.... – John Aug 18 '11 at 14:26

2 Answers

1

Although it may be possible, be warned that any approach will have some degree of inaccuracy, regardless of how it is programmed. Any face/retina-detection software can be tricked, and given the quality of the iPhone's camera, it may not capture enough detail to accurately evaluate the geometric relationship between the two retinas. Recognizing skin color is also problematic due to varying lighting conditions: under fluorescent lights, a person's skin tone appears bluer than under incandescent or natural lighting. Maybe there is another way to go about this?
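The color-cast problem the answer describes can be partially mitigated with a white-balance step before classifying skin tone. A minimal gray-world sketch in Python/NumPy (not from the answer; just one common assumption: the average color of a scene is gray, so each channel is rescaled to match the overall mean):

```python
import numpy as np

def gray_world_balance(img):
    """Rescale each channel so its mean matches the overall mean.

    img: float array of shape (H, W, 3), values in [0, 1].
    A rough correction for the cast that fluorescent vs. incandescent
    lighting puts on skin tones; real apps would use camera metadata
    or a calibrated illuminant estimate instead.
    """
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel mean
    gray = means.mean()                       # target gray level
    scaled = img * (gray / means)             # rescale each channel
    return np.clip(scaled, 0.0, 1.0)

# A bluish-tinted flat image: after correction all channels are equal.
tinted = np.ones((4, 4, 3)) * np.array([0.2, 0.3, 0.5])
balanced = gray_world_balance(tinted)
```

Gray-world is crude (it fails on scenes dominated by one color), but it illustrates why a normalization step usually precedes any skin-color heuristic.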

dwmcc
  • 1,034
  • 8
  • 19
1

For localizing the eyes, I used the algorithm described in "Accurate Eye Center Location and Tracking Using Isophote Curvature" by Roberto Valenti and Theo Gevers in my master's thesis and achieved very good results with it:

http://www.science.uva.nl/research/publications/2008/ValentiCVPR2008/CVPR%2008.pdf
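This is not the paper's full pipeline (which adds a curvedness weight, sign-of-curvature selection to pick dark centers, and refinement), but its core idea, that every pixel votes for the center of curvature of the isophote passing through it, can be sketched in Python/NumPy as follows (the gradient-magnitude vote weight is a simplification, not the paper's exact weighting):

```python
import numpy as np

def eye_center_isophote(gray):
    """Sketch of isophote-curvature center voting (after Valenti & Gevers).

    Each pixel is displaced along the image gradient by the radius of
    its isophote's curvature and votes there; for a roughly circular
    dark iris, the vote accumulator peaks at the pupil center.
    """
    Ly, Lx = np.gradient(gray.astype(float))
    Lyy, Lyx = np.gradient(Ly)
    Lxy, Lxx = np.gradient(Lx)
    denom = Ly**2 * Lxx - 2 * Lx * Lxy * Ly + Lx**2 * Lyy
    mag2 = Lx**2 + Ly**2
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    valid = np.abs(denom) > 1e-9
    safe = np.where(valid, denom, 1.0)
    dx = -Lx * mag2 / safe                 # displacement to the isophote's
    dy = -Ly * mag2 / safe                 # center of curvature
    cx = np.rint(xs + dx).astype(int)
    cy = np.rint(ys + dy).astype(int)
    ok = valid & (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
    acc = np.zeros_like(gray, dtype=float)
    # weight votes by gradient magnitude (stand-in for "curvedness")
    np.add.at(acc, (cy[ok], cx[ok]), np.sqrt(mag2[ok]))
    return np.unravel_index(np.argmax(acc), acc.shape)  # (row, col)

# Synthetic dark circular blob (a stand-in for an iris) at (20, 20):
ys, xs = np.mgrid[0:41, 0:41]
blob = 1.0 - np.exp(-((xs - 20) ** 2 + (ys - 20) ** 2) / (2 * 6.0 ** 2))
row, col = eye_center_isophote(blob)   # peak lands near the blob center
```

For radially symmetric isophotes the displacement vector points exactly at the center, which is why the voting is so stable; on real eye crops, the paper's curvedness weighting and dark-center selection matter much more.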

For face detection/localization, use the Viola-Jones algorithm; there is probably an Objective-C implementation out there somewhere. (Alternatively, OpenCV has one.)

Efrain
  • 3,248
  • 4
  • 34
  • 61