
I'm trying to efficiently convert I420 frames to RGB in order to implement a video player on Android.

It has been stated that glTexSubImage2D() and glTexImage2D() are too slow, so I'm trying to use the EGLImage extensions instead.

Basically I'm following this example to set everything up:

gl2_yuvtex

The problem arises when I want to pass the decoded frame from ffmpeg to the GraphicBuffer.

This is the function I'm calling every time I get a decoded frame:

queueBuffer(frame->data, frame->linesize[0], frame->linesize[1]);

This is the relevant code:

struct OpenGLData {
    EGLDisplay dpy;
    EGLContext context;
    EGLSurface surface;
    EGLImageKHR img;
    EGLNativeWindowType window;

    GLuint yuvTex;
    GLuint gProgram;
    GLint gvPositionHandle;
    GLint gYuvTexSamplerHandle;
};


void queueBuffer(uint8_t** source, int width, int height) {

    sp<GraphicBuffer> yuvTexBuffer = new GraphicBuffer(width, height, HAL_PIXEL_FORMAT_YV12,
            GraphicBuffer::USAGE_HW_TEXTURE | GraphicBuffer::USAGE_SW_WRITE_OFTEN);

    uint8_t* buf = NULL;
    status_t err = yuvTexBuffer->lock(GRALLOC_USAGE_SW_WRITE_OFTEN, (void**)(&buf));
    if (err != 0) {
        LOGE(2, "yuvTexBuffer->lock(...) failed: %d\n", err);
        return;
    }

    copyI420Buffer(source, buf, width, height, yuvTexBuffer->getStride());
    yuvTexBuffer->unlock();

    EGLClientBuffer clientBuffer = (EGLClientBuffer)yuvTexBuffer->getNativeBuffer();
    openGLData->img = eglCreateImageKHR(openGLData->dpy, EGL_NO_CONTEXT,
            EGL_NATIVE_BUFFER_ANDROID, clientBuffer, 0);
    checkEglError("eglCreateImageKHR");

    glGenTextures(1, &openGLData->yuvTex);
    checkGlError("glGenTextures");
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, openGLData->yuvTex);
    checkGlError("glBindTexture");
    glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, (GLeglImageOES)openGLData->img);
    checkGlError("glEGLImageTargetTexture2DOES");
}

The I420 to YV12 copy function:

void copyI420Buffer(uint8_t** src, uint8_t* dst, int srcWidth, int srcHeight, int stride) {
    int strideUV = (stride / 2 + 0xf) & ~0xf;

    // Y
    for (int i = srcHeight; i > 0; i--) {
        memcpy(dst, src[0], srcWidth);
        dst += stride;
        src[0] += srcWidth;
    }

    // The src is I420, the dst is YV12.
    // U
    for (int i = srcHeight / 2; i > 0; i--) {
        memcpy(dst, src[1], srcWidth / 2);
        dst += strideUV;
        src[1] += srcWidth / 2;
    }

    // V
    for (int i = srcHeight / 2; i > 0; i--) {
        memcpy(dst, src[2], srcWidth / 2);
        dst += strideUV;
        src[2] += srcWidth / 2;
    }
}

And then I call the renderFrame routine:

void renderFrame() {

    glClearColor(0.0f, 0.0f, 1.0f, 1.0f);
    checkGlError("glClearColor");
    glClear(GL_DEPTH_BUFFER_BIT | GL_COLOR_BUFFER_BIT);
    checkGlError("glClear");

    glUseProgram(openGLData->gProgram);
    checkGlError("glUseProgram");

    glVertexAttribPointer(openGLData->gvPositionHandle, 2, GL_FLOAT, GL_FALSE, 0, gTriangleVertices);
    checkGlError("glVertexAttribPointer");
    glEnableVertexAttribArray(openGLData->gvPositionHandle);
    checkGlError("glEnableVertexAttribArray");

    glUniform1i(openGLData->gYuvTexSamplerHandle, 0);
    checkGlError("glUniform1i");
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, openGLData->yuvTex);
    checkGlError("glBindTexture");

    glDrawArrays(GL_TRIANGLE_FAN, 0, 4);
    checkGlError("glDrawArrays");

    eglSwapBuffers(openGLData->dpy, openGLData->surface);
    checkEglError("eglSwapBuffers");
}

But all I get is EGL_BAD_CONTEXT after eglSwapBuffers(), along with a GL_INVALID_VALUE error.

I would appreciate any help. Thanks.

  • Shouldn't the GraphicBuffer alloc size be based on `stride` rather than `width`? Does everything succeed before the eglSwapBuffers()? Do you see the effects of the glClear()? You're using some non-public stuff so this may break in the future. What's the encoding format of the video source? – fadden Jan 31 '14 at 06:32
  • I know I'm using a private API, but the public glTexSubImage2D() and glTexImage2D() are too slow for this purpose. Yes, everything succeeds before eglSwapBuffers. I only get a black screen, a lot of eglSwapBuffers errors, and a GL_INVALID_VALUE error in the console. The encoding format of the video source doesn't matter, as I'm trying to render the decoded frames, which come in YUV420 format. I'm rendering to a SurfaceView, which needs RGBA format; I was doing this before using libyuv, which worked great, but I want to improve it and do it using shaders. Thanks – Gorilla.Maguila Jan 31 '14 at 10:42
  • I asked about the source format in case there were options you hadn't considered. Anyway, if you're seeing `glUseProgram` complaints in the log it's because `glUseProgram` is failing, and I'm curious that you're not seeing your own errors flagged... and it's making me wonder if your `checkGlError` is working. Do you have `printf()` redirected to the log? (And to answer my previous question: as in gl2_yuvtex.cpp, `stride` should come *from* the GraphicBuffer; if your input has a non-width stride, the two need to be reconciled. Not yet important unless you're writing off the end of the buf.) – fadden Jan 31 '14 at 15:47
  • You are right about the stride; I have updated the code above to use yuvTexBuffer->getStride(). I'm sure that checkGlError is working, as it's giving me the glUseProgram error GL_INVALID_VALUE. – Gorilla.Maguila Jan 31 '14 at 16:13
  • That looks like a GL driver message from a device with a QCOM SoC (it's probably tagged "Adreno-ES20" or some such). Do you also get one from checkGlError()? What I'm driving at is I'm concerned that there are earlier unreported failures, and if you could identify those you'd have a better chance of fixing what's broken. For example, if the program is invalid, it means program creation failed, which suggests that one of your shaders didn't compile, and you should see a verbose message explaining the compilation problem from your `createProgram()`. – fadden Jan 31 '14 at 17:33
  • You are right about the "Adreno-ES20". I've tracked down the GL_INVALID_VALUE error: it's triggered when I call glDeleteProgram() before exiting. The main error I get is W/Adreno-EGL(12427): EGL_BAD_CONTEXT 0x3006 after eglSwapBuffers(), so I suppose the program creation is not failing. – Gorilla.Maguila Jan 31 '14 at 18:16
  • Can you confirm that there is no error pending if you call `checkEglError()` *before* the `eglSwapBuffers()`? Again, just want to make sure we're looking at the right thing (in case there's something failing between `eglCreateImageKHR()` and `eglSwapBuffers()`). Bottom line is there's nothing obviously wrong with the code you're showing, so I think you're going to have to start with something that works and then gradually make your changes, testing at each forward step, to figure out what's making it break. – fadden Jan 31 '14 at 18:48
  • I'm stuck. There is no error before eglSwapBuffers(). Maybe the error comes from the ANativeWindow I'm passing: instead of android_createDisplaySurface(), I'm getting the ANativeWindow from ANativeWindow_fromSurface(). – Gorilla.Maguila Feb 01 '14 at 20:20
  • Unless I've misunderstood your intentions, you should be able to set up your display surface and EGL context from Java, and just use your native code to prepare the external texture. Once everything is in GraphicBuffers you're just passing buffer handles around and you're not going to gain any performance by doing stuff in C++. There's a wealth of Java-language example code that works with external YUV textures (from the video decoder and camera) -- see http://bigflake.com/mediacodec/ and https://github.com/google/grafika . – fadden Feb 01 '14 at 23:37
  • Finally, the EGL_BAD_CONTEXT was because my code was making EGL and OpenGL ES calls from more than a single thread. Thanks for your support. – Gorilla.Maguila Feb 02 '14 at 11:22
  • Please see my answer here, CPUImage version has great performance: http://stackoverflow.com/questions/9325861/converting-yuv-rgbimage-processing-yuv-during-onpreviewframe-in-android/36565200#36565200 – promenade Apr 12 '16 at 06:38

0 Answers