I'm trying to capture frames from a MacBook Pro's iSight camera using OpenCV 2.4.6, built with the Apple LLVM 4.2 compiler in Xcode.
However, I never receive a frame. I normally set up a while loop that runs until a non-empty frame arrives, but the one below runs for ~30 seconds with no result. How can I debug this?
void testColourCapture() {
    cv::VideoCapture capture(0); // open the default camera
    if (!capture.isOpened()) {
        fprintf(stderr, "ERROR: ColourInput capture is NULL\n");
        return; // no point continuing without a device
    }

    cv::Mat capFrame;
    int frameWaits = 0;
    while (capFrame.empty()) {
        capture.read(capFrame);
        //capture >> capFrame;
        cv::waitKey(30); // give the camera time to deliver a frame
        frameWaits++;
        std::cout << "capture >> capFrame " << frameWaits << "\n";
        if (frameWaits > 1000) {
            break;
        }
    }

    cv::imshow("capFrame", capFrame);
    cv::waitKey(0); // keep the window open
}
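To narrow down whether the backend ever negotiates a video format with the iSight, I'm considering dumping the capture's reported properties right after opening it. This is just a sketch (the helper name is my own, not part of OpenCV); in 2.4 the `CV_CAP_PROP_*` IDs come from highgui:

```cpp
#include <opencv2/highgui/highgui.hpp>
#include <iostream>

// Sketch of a diagnostic helper (my own name, not an OpenCV API):
// if the capture backend never negotiated a format with the camera,
// these properties typically come back as 0.
void reportCaptureProperties(cv::VideoCapture& capture) {
    std::cout << "width:  " << capture.get(CV_CAP_PROP_FRAME_WIDTH)  << "\n"
              << "height: " << capture.get(CV_CAP_PROP_FRAME_HEIGHT) << "\n"
              << "fps:    " << capture.get(CV_CAP_PROP_FPS)          << "\n";
}
```

If width and height both report 0, that would point at the capture backend rather than my read loop.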
I have ensured it is not multi-threaded. Also, capture.isOpened() always returns true.
EDIT: It appears others have had this problem: OpenCV won't capture from MacBook Pro iSight
EDIT: My procedure for installing OpenCV was:
$ sudo port selfupdate
$ sudo port install opencv
Then I dragged libopencv_core.dylib, libopencv_highgui.dylib, libopencv_imgproc.dylib, and libopencv_video.dylib from /opt/local/lib into the Frameworks folder of my Xcode project.
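Since I'm linking the MacPorts dylibs by hand, one thing I can verify is that the built binary actually resolves OpenCV from /opt/local/lib rather than from some other install on the machine. A quick check (the binary path and product name below are placeholders for my own project):

```shell
# Confirm which OpenCV dylibs the binary links against
# (replace build/Debug/MyApp with the actual product path)
otool -L build/Debug/MyApp | grep opencv

# Confirm what MacPorts actually installed
port installed opencv
pkg-config --modversion opencv
```

If otool shows paths outside /opt/local/lib, the app may be picking up a different OpenCV build than the one I installed.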