I am working on an OpenCV application that consumes a lot of CPU.
I want to distribute the frame processing so that it is shared among many hosts.
The idea is the same as the one implemented at http://cloudcv.org/, but the problem there is that you can only send requests to their server to test distributed image processing.
I have searched the internet for a long time, and I wonder whether I can combine OpenCV with Docker Swarm, or OpenCV with Apache Spark, or whether there is some other way to make it distributed.
My code processes frames with OpenCV to detect people in them; I want it to run on many hosts to maximize speed:
while (true)
{
    webcam.read(image);

    // Human detection -------------------------------------
    // Detect on a half-size copy to reduce the per-frame cost.
    cv::Mat resized_image;
    cv::resize(image, resized_image, Size(image.cols / 2, image.rows / 2), 0, 0, INTER_LINEAR);

    // detectMultiScale uses the HOG descriptor to find people
    // in the frame; `found` receives one Rect (x, y, width, height)
    // per candidate detection.
    vector<Rect> found, found_filtered;
    hog.detectMultiScale(resized_image, found, 0, Size(8, 8), Size(32, 32), 1.05, 2);

    // Keep only rectangles that are not completely contained
    // in another detection, so duplicates are removed.
    size_t u, h;
    for (u = 0; u < found.size(); u++)
    {
        Rect r = found[u];
        for (h = 0; h < found.size(); h++)
            if (h != u && (r & found[h]) == r)
                break;
        if (h == found.size())
            found_filtered.push_back(r);
    }

    // Shrink each rectangle slightly and draw it on the original
    // frame, scaling the coordinates back up by 2 to undo the resize.
    for (u = 0; u < found_filtered.size(); u++)
    {
        Rect r = found_filtered[u];
        r.x += cvRound(r.width * 0.1);
        r.width = cvRound(r.width * 0.8);
        r.y += cvRound(r.height * 0.07);
        r.height = cvRound(r.height * 0.8);
        rectangle(image, r.tl() * 2, r.br() * 2, Scalar(0, 255, 0), 3);
        cout << '\a'; // beep once per detected person
    }
}