The following code converts a cv::Mat to a CVPixelBufferRef:
CVPixelBufferRef getImageBufferFromMat(cv::Mat matimg) {
    // Convert to 4-channel BGRA so it matches kCVPixelFormatType_32BGRA
    cv::cvtColor(matimg, matimg, CV_BGR2BGRA);

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithBool:YES], kCVPixelBufferMetalCompatibilityKey,
        [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
        [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
        [NSNumber numberWithInt:matimg.cols], kCVPixelBufferWidthKey,
        [NSNumber numberWithInt:matimg.rows], kCVPixelBufferHeightKey,
        [NSNumber numberWithInt:matimg.step[0]], kCVPixelBufferBytesPerRowAlignmentKey,
        nil];

    CVPixelBufferRef imageBuffer;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorMalloc, matimg.cols, matimg.rows,
                                          kCVPixelFormatType_32BGRA,
                                          (CFDictionaryRef)CFBridgingRetain(options),
                                          &imageBuffer);

    // Copy the Mat's pixel data into the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *base = CVPixelBufferGetBaseAddress(imageBuffer);
    memcpy(base, matimg.data, matimg.total() * matimg.elemSize());
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    return imageBuffer;
}
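For reference, this is roughly how the function gets called (a simplified sketch; the file name is only a placeholder):

cv::Mat matimg = cv::imread("input.jpg");    // 3-channel BGR image
CVPixelBufferRef pixelBuffer = getImageBufferFromMat(matimg);
// ... use the buffer ...
CVPixelBufferRelease(pixelBuffer);           // the function returns a retained buffer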
The problem is that I get only half of the image after the conversion. (To check the result, I convert the CVPixelBufferRef back to a UIImage and save it with UIImageWriteToSavedPhotosAlbum.)
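The check itself is roughly this (a sketch of the round trip through Core Image, using the pixelBuffer returned above; the exact code may differ slightly):

CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:ciImage fromRect:[ciImage extent]];
UIImage *uiImage = [UIImage imageWithCGImage:cgImage];

// Save to the photo library just to inspect the result
UIImageWriteToSavedPhotosAlbum(uiImage, nil, nil, nil);
CGImageRelease(cgImage);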
Interestingly, the image sizes of the cv::Mat and the CVPixelBufferRef are the same.
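This is roughly how I compared them (logging inside getImageBufferFromMat just before the return; purely for illustration):

NSLog(@"Mat: %d x %d, step = %zu", matimg.cols, matimg.rows, matimg.step[0]);
NSLog(@"PixelBuffer: %zu x %zu, bytesPerRow = %zu",
      CVPixelBufferGetWidth(imageBuffer),
      CVPixelBufferGetHeight(imageBuffer),
      CVPixelBufferGetBytesPerRow(imageBuffer));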
Next, I tried resizing the image just before the memcpy, doubling its height:
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *base = CVPixelBufferGetBaseAddress(imageBuffer);

cv::resize(matimg, matimg, cv::Size(), 1, 2);    // keep width, double height
memcpy(base, matimg.data, matimg.total() * matimg.elemSize());

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
Now the image size is still the same. I would really like to know what is causing this behavior; I am sure I am missing something.