
I am receiving frames of video and creating a CVPixelBuffer from two planar arrays extracted from each frame using CVPixelBufferCreateWithPlanarBytes(). I have checked that the pixel buffer obtained from CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) and the pixel buffer I create are the same, but when I create a texture using CVOpenGLESTextureCacheCreateTextureFromImage() it returns error -6683. Can anyone help me find what is going wrong here? Thanks in advance.

Mani
Rani
  • How did your problem get resolved? I am getting the same error, and I have used CVPixelBufferCreateWithPlanarBytes(); should I go with CVPixelBufferCreate() instead? – Amitg2k12 Sep 17 '14 at 13:32

2 Answers


According to the documentation, the reason for this error is:

"The pixel buffer is not compatible with OpenGL due to an unsupported buffer size, pixel format, or attribute."

So it means one of two things:

1: When setting up your texture buffer you are using a pixel format not supported in OpenGL ES, e.g. when calling:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, data);

2: If I remember rightly, it's because the video frame's image format isn't compatible with the OpenGL ES implementation. If it comes from the device camera, I believe it is YCbCr and has to be converted to an RGB / RGBA image.

Remember that different image formats also have different bits per pixel, so check which ones are supported by iOS; I think most are from the Khronos spec. I mainly use RGBA_8888 for quality, and it works with any PNG I tend to use when not using PVRTC.

This shows how to take a frame and, using CGImage, convert it to an OpenGL ES-friendly format on iOS that you can use as a texture:

Convert an UIImage in a texture

Dev2rights
  • I am using kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, and creating textures as in Apple's example http://developer.apple.com/library/ios/#samplecode/GLCameraRipple/Introduction/Intro.html – Rani Apr 24 '13 at 08:36

Instead of creating the CVPixelBufferRef with CVPixelBufferCreateWithPlanarBytes(), I tried creating it with CVPixelBufferCreate() and then copied the plane data manually, like this:

CVPixelBufferLockBaseAddress(pixelBuffer, 0);

void *baseaddrY  = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
void *baseaddrUV = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);

memcpy(baseaddrY, dataY, sizeofY);
memcpy(baseaddrUV, dataUV, sizeofUV);

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

This creates the CVPixelBufferRef. It is taken from Anton's answer to the question CVOpenGLESTextureCacheCreateTextureFromImage returns error 6683.

Rani