
I am attempting to use the GPU to do video processing. So far I have been successful in extracting the frames from a video and processing them on the GPU. Please keep in mind that I'm a total OpenGL noob.

However, I have run into the following bottleneck: getting the finished frame back from the GPU and recording it into a video.

I am aware of this example, but it doesn't exactly suit my needs.

My OpenGL context is a javax.microedition.khronos.egl.EGLContext, which prevents me from sharing textures between the context described in Bigflake's example and my own, which contains the texture I wish to write to the video.

Is there any way to feed the MediaCodec encoder data directly from the GPU? From my research it seems that one can only set a surface provided by the encoder itself as a source using this method.
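For reference, the method in question is MediaCodec.createInputSurface(): the encoder hands you the Surface, rather than accepting one you already have. A minimal sketch of that setup (format parameters here are illustrative, not taken from the question):

```java
// Sketch: configuring a MediaCodec video encoder and obtaining its input Surface.
// Resolution, bitrate, and frame rate are placeholder values.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// Must be called after configure() and before start().
Surface inputSurface = encoder.createInputSurface();
encoder.start();
```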

Can I somehow draw my texture onto the surface provided by the encoder?

I will try to provide any additional details upon request.

  • Do you mean - you do not have control over the EGL context creation step (ie EGL_RECORDABLE_ANDROID flag cannot be added in the current config) ? – Prabindh Mar 22 '16 at 17:39
  • Two different contexts are needed with different configurations. One for extracting and processing frames, and one for writing the new movie file. I can not share textures between the two. – Rakatan Mar 22 '16 at 19:03
  • Why do you need two contexts with different configurations? Do not conflate EGLSurface with Surface, they are independent. – fadden Mar 22 '16 at 20:01
  • @fadden In the EncodeAndMuxTest that i am attempting to use the context is setup using EGL14.eglCreateContext(), returning an android.opengl.EGLContext. In the ExtractMpegFramesTest that i have used for extracting the frames and processing on the GPU, the context is defined like so: (EGL10) EGLContext.getEGL().eglCreateContext, that returns a javax.microedition.khronos.egl.EGLContext. I figure that the first context is needed for the swapBuffers method and EGLExt.eglPresentationTimeANDROID – Rakatan Mar 23 '16 at 08:31
  • Per your reference to a linked example that doesn't fit your needs: you're unable to use the `share_context` parameter of `eglCreateContext` to establish resource sharing between the two contexts? – Tommy Mar 23 '16 at 15:11
  • No, as I've stated above, the two contexts I am creating for each task are different and can't be passed as the share_context parameter. I have tried. – Rakatan Mar 23 '16 at 15:24
  • There are two versions of http://bigflake.com/mediacodec/#ExtractMpegFramesTest , one that uses EGL 1.0, one that uses EGL 1.4. There is no reason you can't use EGL 1.4 for everything. Underneath it's a single GLES driver context; the choice of API version doesn't change that. Create one context with the flags you need for both (e.g. make sure EGL_RECORDABLE_ANDROID is defined) and just use that. – fadden Mar 23 '16 at 16:23
  • Oh, yes i haven't considered the second example. Thank you for the pointer @fadden. I will post the results as soon as I can. – Rakatan Mar 23 '16 at 16:49

1 Answer


I ended up replacing my EGL10 context with an EGL14 one, as @fadden suggested.
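The EGL14 setup looks roughly like the following (a sketch along the lines of Bigflake's EncodeAndMuxTest, assuming API level 18+; the key part is requesting EGL_RECORDABLE_ANDROID in the config so the surface can feed the encoder):

```java
// EGL_RECORDABLE_ANDROID is not exposed as a constant before API 26,
// so it is commonly defined by hand; 0x3142 is its value per the extension.
private static final int EGL_RECORDABLE_ANDROID = 0x3142;

EGLDisplay eglDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
int[] version = new int[2];
EGL14.eglInitialize(eglDisplay, version, 0, version, 1);

int[] attribList = {
        EGL14.EGL_RED_SIZE, 8,
        EGL14.EGL_GREEN_SIZE, 8,
        EGL14.EGL_BLUE_SIZE, 8,
        EGL14.EGL_ALPHA_SIZE, 8,
        EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
        EGL_RECORDABLE_ANDROID, 1,   // lets the surface feed a video encoder
        EGL14.EGL_NONE
};
EGLConfig[] configs = new EGLConfig[1];
int[] numConfigs = new int[1];
EGL14.eglChooseConfig(eglDisplay, attribList, 0, configs, 0,
        configs.length, numConfigs, 0);

int[] contextAttribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
EGLContext eglContext = EGL14.eglCreateContext(eglDisplay, configs[0],
        EGL14.EGL_NO_CONTEXT, contextAttribs, 0);
```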

I then shared this context with the one created by the CodecInputSurface like in this example.
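Sharing boils down to passing the existing processing context as the share_context argument instead of EGL_NO_CONTEXT when CodecInputSurface creates its context. A hedged sketch (variable names like mProcessingContext are my own, not from the examples):

```java
// Create the encoder-side context sharing texture names with the
// processing context, so the decoded/processed texture is visible to both.
EGLContext sharedWith = mProcessingContext;  // assumed: the EGL14 context used for extraction/processing
int[] contextAttribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
EGLContext encoderContext = EGL14.eglCreateContext(mEGLDisplay, mEGLConfig,
        sharedWith,                          // instead of EGL14.EGL_NO_CONTEXT
        contextAttribs, 0);
```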

The most important step was rendering the shared texture onto the surface created for the second context. I did this with the help of this answer.
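Per frame, the flow looks roughly like this (mTextureRender is assumed to be a helper that draws a full-screen quad with the GL_TEXTURE_EXTERNAL_OES texture, mirroring Bigflake's STextureRender; all names here are illustrative):

```java
// Make the encoder context current on the window surface wrapping the
// MediaCodec input Surface, draw the shared texture, stamp the presentation
// time, and swap buffers to submit the frame to the encoder.
EGL14.eglMakeCurrent(mEGLDisplay, mEncoderSurface, mEncoderSurface, mEncoderContext);
mTextureRender.drawFrame(mSharedTextureId);
EGLExt.eglPresentationTimeANDROID(mEGLDisplay, mEncoderSurface, frameTimeNanos);
EGL14.eglSwapBuffers(mEGLDisplay, mEncoderSurface);
```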

I hope this can be of help to anybody else; I will attempt to clarify if requested.

Thanks again to fadden for his help :).
