
I have a callback invoked continuously on its own thread (not the main thread); the source is 1920 x 1088 (yep, 88), 30 fps video:

@Override
public void onYuvDataReceived(MediaFormat mediaFormat, ByteBuffer byteBuffer, 
                              final int width, final int height) {

from mediaFormat I can determine the colorFormat: COLOR_FormatYUV420SemiPlanar or COLOR_FormatYUV420Planar (I would prefer to support both, but at least one of these)
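For reference, the two formats differ only in how chroma is laid out after the Y plane; the plane sizes and planar offsets are simple arithmetic (a small helper of my own, just for illustration):

```java
// 8-bit 4:2:0 frame layouts for width x height (even dimensions assumed):
//   COLOR_FormatYUV420Planar (I420):     [Y w*h][U w*h/4][V w*h/4]
//   COLOR_FormatYUV420SemiPlanar (NV12): [Y w*h][interleaved UV w*h/2]
final class YuvLayout {
    // Offsets of the Y, U and V planes inside a tightly packed I420 buffer.
    static int[] i420Offsets(int width, int height) {
        int ySize = width * height;
        int chromaSize = (width / 2) * (height / 2);
        return new int[]{0, ySize, ySize + chromaSize};
    }

    // Total bytes of one frame; identical for both formats.
    static int frameSize(int width, int height) {
        return width * height * 3 / 2;
    }
}
```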

Now I want to draw these frames, preferably on a TextureView (though a SurfaceView would also do). Converting to RGB/Bitmap isn't efficient; in my case it can take 60+ ms per frame (and that's 30 fps video...), so I should stick to some "native way" or "GPU way" (right?).

WebRTC way

I've found the WebRTC lib very helpful; it contains some breadcrumbs for my rendering case, but I can't achieve "video": only the first frame is drawn (properly, no issues)

    int rowStrideY = width;
    int rowStrideU = width / 2;
    int rowStrideV = width / 2;

    // TODO adjust to ColorFormat
    int basicOffset = byteBuffer.remaining() / 6;
    int offsetY = 0;
    int offsetU = basicOffset * 4;
    int offsetV = basicOffset * 5;

    ByteBuffer i420ByteBuffer = byteBuffer.duplicate();
    i420ByteBuffer.position(offsetY);
    final ByteBuffer dataY = i420ByteBuffer.slice();
    i420ByteBuffer.position(offsetU);
    final ByteBuffer dataU = i420ByteBuffer.slice();
    i420ByteBuffer.position(offsetV);
    final ByteBuffer dataV = i420ByteBuffer.slice();

    JavaI420Buffer javaI420Buffer = JavaI420Buffer.wrap(width, height,
            dataY, rowStrideY,
            dataU, rowStrideU,
            dataV, rowStrideV,
            () -> {
                JniCommon.nativeFreeByteBuffer(i420ByteBuffer);
            });

    VideoFrame frame = new VideoFrame(javaI420Buffer, 0, System.currentTimeMillis());
    surfaceViewRenderer.onFrame(frame);
    //turnOffYuv(); // no crash, but only first frame drawn
}

(some source of this in HERE)

When fed a second/further frame, SurfaceViewRenderer throws

FATAL EXCEPTION: SurfaceViewRendererEglRenderer
   Process: thats.my.package, PID: 12970
   java.lang.IllegalStateException: buffer is inaccessible
    at java.nio.DirectByteBuffer.slice(DirectByteBuffer.java:159)
    at org.webrtc.JavaI420Buffer.getDataY(JavaI420Buffer.java:118)
    at org.webrtc.VideoFrameDrawer$YuvUploader.uploadFromBuffer(VideoFrameDrawer.java:114)
    at org.webrtc.VideoFrameDrawer.drawFrame(VideoFrameDrawer.java:221)
    at org.webrtc.EglRenderer.renderFrameOnRenderThread(EglRenderer.java:664)
    at org.webrtc.EglRenderer.lambda$im8Sa54i366ODPy-soB9Bg4O-w4(Unknown Source:0)
    at org.webrtc.-$$Lambda$EglRenderer$im8Sa54i366ODPy-soB9Bg4O-w4.run(Unknown Source:2)
    at android.os.Handler.handleCallback(Handler.java:883)
    at android.os.Handler.dispatchMessage(Handler.java:100)
    at org.webrtc.EglRenderer$HandlerWithExceptionCallback.dispatchMessage(EglRenderer.java:103)
    at android.os.Looper.loop(Looper.java:214)
    at android.os.HandlerThread.run(HandlerThread.java:67)
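My current suspicion (just a sketch, not verified): the release callback frees the single direct buffer that all three slices point into, so the second frame's `slice()` touches freed memory. Deep-copying the frame into a buffer the renderer owns would at least rule that out (tightly packed I420 assumed; helper name is mine):

```java
import java.nio.ByteBuffer;

final class FrameCopy {
    // Copy one tightly packed I420 frame so the source buffer can be
    // reused/freed by the producer without invalidating the renderer's view.
    static ByteBuffer copyI420(ByteBuffer src, int width, int height) {
        int frameSize = width * height * 3 / 2;
        ByteBuffer dup = src.duplicate();   // leave src's position/limit alone
        dup.position(0);
        dup.limit(frameSize);
        ByteBuffer copy = ByteBuffer.allocateDirect(frameSize);
        copy.put(dup);
        copy.rewind();
        return copy;
    }
}
```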

Some sources suggest creating a VideoTrack, addSink, etc., but I've failed to prepare my own; I'm also a bit confused and scared by the native methods in the code.

OpenGL ES way

Scared a bit by WebRTC's native underpinnings, I turned to another, more "pure" way for my purpose: OpenGL ES and GLSurfaceView. I found THIS Renderer, adjusted it a bit, and achieved distorted-color, kind-of-stretched video, mostly dark, but very smooth...

onCreateView in Fragment:

mGLSurfaceView = rootView.findViewById(R.id.GLSurfaceView);
mGLSurfaceView.setEGLContextClientVersion(2);
mRenderer = new NV21Renderer();
mGLSurfaceView.setRenderer(mRenderer);
mGLSurfaceView.setPreserveEGLContextOnPause(true);
mGLSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);

frame passing:

@Override
public void onYuvData(MediaFormat mediaFormat, byte[] data, int dataSize, int width, int height) {
    // data[] in here is NV21
    //YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    // mediaFormat contains "original" colorFormat
    mGLSurfaceView.queueEvent(new Runnable() {
        @Override
        public void run() {
            mRenderer.onPreviewFrame(data);
            mGLSurfaceView.requestRender();
        }
    });
}
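One thing worth noting here: NV21 stores chroma as interleaved V/U pairs after the Y plane, so slicing data[] at LENGTH and LENGTH + LENGTH_4 as if it were planar (as the renderer below does) feeds scrambled chroma to the U/V textures. De-interleaving first would match the three-texture shader (a sketch, helper name mine):

```java
final class Nv21Split {
    // NV21 layout: [Y w*h][V0 U0 V1 U1 ...]; note V comes before U.
    static void deinterleave(byte[] nv21, int width, int height,
                             byte[] uOut, byte[] vOut) {
        int ySize = width * height;
        int pairs = ySize / 4;            // one V/U pair per 2x2 pixel block
        for (int i = 0; i < pairs; i++) {
            vOut[i] = nv21[ySize + 2 * i];
            uOut[i] = nv21[ySize + 2 * i + 1];
        }
    }
}
```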

renderer (these GLES20. calls are my first OpenGL lines ever):

public class NV21Renderer implements GLSurfaceView.Renderer {
    public static final int recWidth = 1920;
    public static final int recHeight = 1088;

    private static final int LENGTH = recWidth * recHeight;
    private static final int LENGTH_4 = recWidth * recHeight / 4;

    private static final int U_INDEX = LENGTH;
    private static final int V_INDEX = LENGTH + LENGTH_4;

    private int[] yTextureNames;
    private int[] uTextureNames;
    private int[] vTextureNames;

    private final FloatBuffer mVertices;
    private final ShortBuffer mIndices;

    private int mProgramObject;
    private int mPositionLoc;
    private int mTexCoordLoc;

    private int yTexture;
    private int uTexture;
    private int vTexture;

    private final ByteBuffer yBuffer;
    private final ByteBuffer uBuffer;
    private final ByteBuffer vBuffer;

    byte[] ydata = new byte[LENGTH];
    byte[] uData = new byte[LENGTH_4];
    byte[] vData = new byte[LENGTH_4];

    private boolean surfaceCreated = false;
    private boolean dirty = false; // prevent drawing first frame when no data

    public NV21Renderer() {
        mVertices = ByteBuffer.allocateDirect(mVerticesData.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        mVertices.put(mVerticesData).position(0);

        mIndices = ByteBuffer.allocateDirect(mIndicesData.length * 2)
                .order(ByteOrder.nativeOrder()).asShortBuffer();
        mIndices.put(mIndicesData).position(0);

        yBuffer = ByteBuffer.allocateDirect(LENGTH);
        uBuffer = ByteBuffer.allocateDirect(LENGTH_4);
        vBuffer = ByteBuffer.allocateDirect(LENGTH_4);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Timber.d("onSurfaceCreated");

        GLES20.glEnable(GLES20.GL_TEXTURE_2D);

        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);

        final String vShaderStr = vertexShader;
        final String fShaderStr = fragmentShader;
        IntBuffer frameBuffer = IntBuffer.allocate(1);
        IntBuffer renderBuffer = IntBuffer.allocate(1);
        GLES20.glGenFramebuffers(1, frameBuffer);
        GLES20.glGenRenderbuffers(1, renderBuffer);
        GLES20.glActiveTexture(GLES20.GL_ACTIVE_TEXTURE);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffer.get(0));
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBuffer.get(0));

        GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16,
                recWidth, recHeight);

        IntBuffer parameterBufferHeigth = IntBuffer.allocate(1);
        IntBuffer parameterBufferWidth = IntBuffer.allocate(1);
        GLES20.glGetRenderbufferParameteriv(GLES20.GL_RENDERBUFFER, GLES20.GL_RENDERBUFFER_WIDTH, parameterBufferWidth);
        GLES20.glGetRenderbufferParameteriv(GLES20.GL_RENDERBUFFER, GLES20.GL_RENDERBUFFER_HEIGHT, parameterBufferHeigth);
        GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_RENDERBUFFER, renderBuffer.get(0));
        if (GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) != GLES20.GL_FRAMEBUFFER_COMPLETE) {
            Timber.w("gl frame buffer status != frame buffer complete %s",
                    GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER));
        }
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        mProgramObject = loadProgram(vShaderStr, fShaderStr);

        mPositionLoc = GLES20.glGetAttribLocation(mProgramObject, "a_position");
        mTexCoordLoc = GLES20.glGetAttribLocation(mProgramObject, "a_texCoord");

        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        yTexture = GLES20.glGetUniformLocation(mProgramObject, "y_texture");
        yTextureNames = new int[1];
        GLES20.glGenTextures(1, yTextureNames, 0);
        int yTextureName = yTextureNames[0];

        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        uTexture = GLES20.glGetUniformLocation(mProgramObject, "u_texture");
        uTextureNames = new int[1];
        GLES20.glGenTextures(1, uTextureNames, 0);
        int uTextureName = uTextureNames[0];

        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        vTexture = GLES20.glGetUniformLocation(mProgramObject, "v_texture");
        vTextureNames = new int[1];
        GLES20.glGenTextures(1, vTextureNames, 0);
        int vTextureName = vTextureNames[0];

        surfaceCreated = true;
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Timber.d("onSurfaceChanged width:" + width + " height:" + height +
                " surfaceCreated:" + surfaceCreated + " dirty:" + dirty);
        GLES20.glActiveTexture(GLES20.GL_ACTIVE_TEXTURE);
        GLES20.glViewport(0, 0, width, height);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
    }

    @Override
    public final void onDrawFrame(GL10 gl) {
        Timber.d("onDrawFrame surfaceCreated:" + surfaceCreated + " dirty:" + dirty);
        if (!surfaceCreated || !dirty) return;

        // Clear the color buffer
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        // Use the program object
        GLES20.glUseProgram(mProgramObject);

        // Load the vertex position
        mVertices.position(0);
        GLES20.glVertexAttribPointer(mPositionLoc, 3, GLES20.GL_FLOAT, false, 5 * 4, mVertices);
        // Load the texture coordinate
        mVertices.position(3);
        GLES20.glVertexAttribPointer(mTexCoordLoc, 2, GLES20.GL_FLOAT, false, 5 * 4, mVertices);

        GLES20.glEnableVertexAttribArray(mPositionLoc);
        GLES20.glEnableVertexAttribArray(mTexCoordLoc);

        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTextureNames[0]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
                recWidth, recHeight, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, yBuffer);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTextureNames[0]);
        GLES20.glUniform1i(yTexture, 0);

        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uTextureNames[0]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
                recWidth / 2, recHeight / 2, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, uBuffer);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1 + 2);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uTextureNames[0]);
        GLES20.glUniform1i(uTexture, 2);

        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vTextureNames[0]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
                recWidth / 2, recHeight / 2, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, vBuffer);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1 + 1);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vTextureNames[0]);
        GLES20.glUniform1i(vTexture, 1);

        GLES20.glDrawElements(GLES20.GL_TRIANGLES, 6, GLES20.GL_UNSIGNED_SHORT, mIndices);

        dirty = false;
    }

    private int loadShader(int type, String shaderSrc) {
        int shader;
        int[] compiled = new int[1];

        shader = GLES20.glCreateShader(type);
        if (shader == 0) {
            return 0;
        }
        GLES20.glShaderSource(shader, shaderSrc);
        GLES20.glCompileShader(shader);
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);

        if (compiled[0] == 0) {
            Timber.d("loadShader %s", GLES20.glGetShaderInfoLog(shader));
            GLES20.glDeleteShader(shader);
            return 0;
        }
        return shader;
    }

    private int loadProgram(String vertShaderSrc, String fragShaderSrc) {
        int vertexShader;
        int fragmentShader;
        int programObject;
        int[] linked = new int[1];

        vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertShaderSrc);
        if (vertexShader == 0) {
            return 0;
        }

        fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragShaderSrc);
        if (fragmentShader == 0) {
            GLES20.glDeleteShader(vertexShader);
            return 0;
        }

        programObject = GLES20.glCreateProgram();

        if (programObject == 0) {
            return 0;
        }

        GLES20.glAttachShader(programObject, vertexShader);
        GLES20.glAttachShader(programObject, fragmentShader);

        GLES20.glLinkProgram(programObject);

        GLES20.glGetProgramiv(programObject, GLES20.GL_LINK_STATUS, linked, 0);

        if (linked[0] == 0) {
            Timber.e("Error linking program:%s", GLES20.glGetProgramInfoLog(programObject));
            GLES20.glDeleteProgram(programObject);
            return 0;
        }

        GLES20.glDeleteShader(vertexShader);
        GLES20.glDeleteShader(fragmentShader);

        return programObject;
    }

    public void onPreviewFrame(byte[] data) {
        System.arraycopy(data, 0, ydata, 0, LENGTH);
        yBuffer.put(ydata);
        yBuffer.position(0);

        System.arraycopy(data, U_INDEX, uData, 0, LENGTH_4);
        uBuffer.put(uData);
        uBuffer.position(0);

        System.arraycopy(data, V_INDEX, vData, 0, LENGTH_4);
        vBuffer.put(vData);
        vBuffer.position(0);

        dirty = true;
    }

    private static final String vertexShader =
            "attribute vec4 a_position;                         \n" +
                    "attribute vec2 a_texCoord;                         \n" +
                    "varying vec2 v_texCoord;                           \n" +

                    "void main(){                                       \n" +
                    "   gl_Position = a_position;                       \n" +
                    "   v_texCoord = a_texCoord;                        \n" +
                    "}                                                  \n";

    private static final String fragmentShader =
            "#ifdef GL_ES                                       \n" +
                    "precision highp float;                             \n" +
                    "#endif                                             \n" +

                    "varying vec2 v_texCoord;                           \n" +
                    "uniform sampler2D y_texture;                       \n" +
                    "uniform sampler2D u_texture;                       \n" +
                    "uniform sampler2D v_texture;                       \n" +

                    "void main (void){                                  \n" +
                    "   float r, g, b, y, u, v;                         \n" +

                    //GL_LUMINANCE replicates each Y byte into the R, G and B components,
                    //so we read it from R (G or B would work just as well)
                    //see https://stackoverflow.com/questions/12130790/yuv-to-rgb-conversion-by-fragment-shader/17615696#17615696
                    //and https://stackoverflow.com/questions/22456884/how-to-render-androids-yuv-nv21-camera-image-on-the-background-in-libgdx-with-o
                    "   y = texture2D(y_texture, v_texCoord).r;         \n" +

                    //Since we use GL_LUMINANCE, each component sits in its own map
                    "   u = texture2D(u_texture, v_texCoord).r - 0.5;  \n" +
                    "   v = texture2D(v_texture, v_texCoord).r - 0.5;  \n" +


                    //The numbers are just YUV to RGB conversion constants
                    "   r = y + 1.13983*v;                              \n" +
                    "   g = y - 0.39465*u - 0.58060*v;                  \n" +
                    "   b = y + 2.03211*u;                              \n" +

                    //We finally set the RGB color of our pixel
                    "   gl_FragColor = vec4(r, g, b, 1.0);              \n" +
                    "}                                                  \n";

    private static final float[] mVerticesData = {
            -1.f, 1.f, 0.0f, // Position 0
            0.0f, 0.0f, // TexCoord 0
            -1.f, -1.f, 0.0f, // Position 1
            0.0f, 1.0f, // TexCoord 1
            1.f, -1.f, 0.0f, // Position 2
            1.0f, 1.0f, // TexCoord 2
            1.f, 1.f, 0.0f, // Position 3
            1.0f, 0.0f // TexCoord 3
    };
    private static final short[] mIndicesData = {0, 1, 2, 0, 2, 3};
}

On some Android 10 devices this works, but gives very dark and stretched video, with colors mainly green and pink/red.

But on Android 13 (Pixel) I always get

signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x00000000

thrown in onSurfaceCreated. Is it misconfigured somehow...?
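(Two things I can't rule out myself: GLES20.GL_ACTIVE_TEXTURE is a glGet* query token, not a valid argument for glActiveTexture; and IntBuffer.allocate() yields a non-direct, heap-backed buffer, which GL JNI bindings may reject or mishandle. A direct, native-order buffer would look like this:)

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.IntBuffer;

final class GlBuffers {
    // GLES JNI entry points generally expect direct, native-order buffers;
    // IntBuffer.allocate() returns a heap-backed (non-direct) one instead.
    static IntBuffer directIntBuffer(int ints) {
        return ByteBuffer.allocateDirect(ints * 4)
                .order(ByteOrder.nativeOrder())
                .asIntBuffer();
    }
}
```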

so: How to draw YUV/NV21, from simple byte array/buffer to picture/video on screen?

PS. The YUV stream/callback itself is fine: I can encode it with e.g. H.264 and drop it to an mp4 file or stream it out, no issues, or inspect a single frame via a JPEG generated by YuvImage. I just want to draw it in real time, aka a "preview"...

snachmsm
  • Take a look at this post. He uses different factors to calculate r, g, b in the shader. Maybe this fixes your color problem: https://stackoverflow.com/a/13097718/11016652 – zomega Mar 17 '23 at 17:48
  • Please debug your app and tell us which command exactly in `onSurfaceCreated` causes the `SIGSEGV`. – zomega Mar 17 '23 at 17:49

1 Answer


This works with ColorFormat COLOR_FormatYUV420Planar on both Android versions mentioned above. It's mostly copy-paste from THIS QUESTION, huge kudos!

public class Yuv420PlanarRenderer implements GLSurfaceView.Renderer {
    int[] mTextureIds = new int[3];
    float[] mScaleMatrix = new float[16];

    private final FloatBuffer mVertexBuffer;
    private final FloatBuffer mTextureBuffer;
    private final ShortBuffer mDrawListBuffer;

    boolean mVideoFitEnabled = true;
    boolean mVideoDisabled = false;

    // number of coordinates per vertex in this array
    static final int COORDS_PER_VERTEX = 3;
    static final int TEXTURECOORDS_PER_VERTEX = 2;

    private static final float[] mXYZCoords = {
            -1.0f, 1.0f, 0.0f, // top left
            -1.0f, -1.0f, 0.0f, // bottom left
            1.0f, -1.0f, 0.0f, // bottom right
            1.0f, 1.0f, 0.0f // top right
    };

    private static final float[] mUVCoords = {
            0, 0, // top left
            0, 1, // bottom left
            1, 1, // bottom right
            1, 0  // top right
    };

    private final short[] mVertexIndex = {0, 1, 2, 0, 2, 3}; // order to draw vertices

    private static final String vertexShaderCode =

            "uniform mat4 uMVPMatrix;"
                    + "attribute vec4 aPosition;\n"
                    + "attribute vec2 aTextureCoord;\n"
                    + "varying vec2 vTextureCoord;\n"

                    + "void main() {\n"
                    + "  gl_Position = uMVPMatrix * aPosition;\n"
                    + "  vTextureCoord = aTextureCoord;\n"
                    + "}\n";


    private static final String fragmentShaderCode =

            "precision mediump float;\n"
                    + "uniform sampler2D Ytex;\n"
                    + "uniform sampler2D Utex,Vtex;\n"
                    + "varying vec2 vTextureCoord;\n"

                    + "void main(void) {\n"
                    + "  float nx,ny,r,g,b,y,u,v;\n"
                    + "  mediump vec4 txl,ux,vx;"
                    + "  nx=vTextureCoord[0];\n"
                    + "  ny=vTextureCoord[1];\n"

                    + "  y=texture2D(Ytex,vec2(nx,ny)).r;\n"
                    + "  u=texture2D(Utex,vec2(nx,ny)).r;\n"
                    + "  v=texture2D(Vtex,vec2(nx,ny)).r;\n"

                    + "  y=1.1643*(y-0.0625);\n"
                    + "  u=u-0.5;\n"
                    + "  v=v-0.5;\n"

                    + "  r=y+1.5958*v;\n"
                    + "  g=y-0.39173*u-0.81290*v;\n"
                    + "  b=y+2.017*u;\n"

                    + "  gl_FragColor=vec4(r,g,b,1.0);\n"
                    + "}\n";


    private final ReentrantLock mFrameLock = new ReentrantLock();
    private Frame mCurrentFrame;

    private int mProgram;
    private int mTextureWidth;
    private int mTextureHeight;
    private int mViewportWidth;
    private int mViewportHeight;

    public Yuv420PlanarRenderer() {
        ByteBuffer bb = ByteBuffer.allocateDirect(mXYZCoords.length * 4);
        bb.order(ByteOrder.nativeOrder());
        mVertexBuffer = bb.asFloatBuffer();
        mVertexBuffer.put(mXYZCoords);
        mVertexBuffer.position(0);

        ByteBuffer tb = ByteBuffer.allocateDirect(mUVCoords.length * 4);
        tb.order(ByteOrder.nativeOrder());
        mTextureBuffer = tb.asFloatBuffer();
        mTextureBuffer.put(mUVCoords);
        mTextureBuffer.position(0);

        ByteBuffer dlb = ByteBuffer.allocateDirect(mVertexIndex.length * 2);
        dlb.order(ByteOrder.nativeOrder());
        mDrawListBuffer = dlb.asShortBuffer();
        mDrawListBuffer.put(mVertexIndex);
        mDrawListBuffer.position(0);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);

        int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
        int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);

        mProgram = GLES20.glCreateProgram(); // create empty OpenGL ES Program
        GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader to program
        GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
        GLES20.glLinkProgram(mProgram);

        int positionHandle = GLES20.glGetAttribLocation(mProgram, "aPosition");
        int textureHandle = GLES20.glGetAttribLocation(mProgram, "aTextureCoord");

        GLES20.glVertexAttribPointer(positionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, COORDS_PER_VERTEX * 4, mVertexBuffer);
        GLES20.glEnableVertexAttribArray(positionHandle);
        GLES20.glVertexAttribPointer(textureHandle, TEXTURECOORDS_PER_VERTEX, GLES20.GL_FLOAT, false, TEXTURECOORDS_PER_VERTEX * 4, mTextureBuffer);
        GLES20.glEnableVertexAttribArray(textureHandle);
        GLES20.glUseProgram(mProgram);

        int i = GLES20.glGetUniformLocation(mProgram, "Ytex");
        GLES20.glUniform1i(i, 0); /* Bind Ytex to texture unit 0 */

        i = GLES20.glGetUniformLocation(mProgram, "Utex");
        GLES20.glUniform1i(i, 1); /* Bind Utex to texture unit 1 */

        i = GLES20.glGetUniformLocation(mProgram, "Vtex");
        GLES20.glUniform1i(i, 2); /* Bind Vtex to texture unit 2 */

        mTextureWidth = 0;
        mTextureHeight = 0;
    }

    private static void initializeTexture(int name, int id, int width, int height) {
        GLES20.glActiveTexture(name);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, id);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width, height, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, null);
    }

    private void setupTextures(Frame frame) {
        if (mTextureIds[0] != 0) {
            GLES20.glDeleteTextures(3, mTextureIds, 0);
        }

        GLES20.glGenTextures(3, mTextureIds, 0);

        int w = frame.getWidth();
        int h = frame.getHeight();
        int hw = (w + 1) >> 1;
        int hh = (h + 1) >> 1;

        initializeTexture(GLES20.GL_TEXTURE0, mTextureIds[0], w, h);
        initializeTexture(GLES20.GL_TEXTURE1, mTextureIds[1], hw, hh);
        initializeTexture(GLES20.GL_TEXTURE2, mTextureIds[2], hw, hh);

        mTextureWidth = frame.getWidth();
        mTextureHeight = frame.getHeight();
    }

    private void updateTextures(Frame frame) {
        int width = frame.getWidth();
        int height = frame.getHeight();
        int half_width = (width + 1) >> 1;
        int half_height = (height + 1) >> 1;
        int y_size = width * height;
        int uv_size = half_width * half_height;

        ByteBuffer bb = frame.getBuffer();
        bb.clear();  // If we are reusing this frame, make sure we reset position and limit

        // TODO handle different frame.colorFormat than COLOR_FormatYUV420Planar
        if (bb.remaining() == y_size + uv_size * 2) {
            bb.position(0);

            GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1);
            GLES20.glPixelStorei(GLES20.GL_PACK_ALIGNMENT, 1);

            GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureIds[0]);
            GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, width, height, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, bb);

            bb.position(y_size);

            GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureIds[1]);
            GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, half_width, half_height, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, bb);

            bb.position(y_size + uv_size);

            GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureIds[2]);
            GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, half_width, half_height, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, bb);


            // Note: the fragment shader above declares no "width"/"height" uniforms,
            // so these locations are -1 and the two glUniform1f calls are no-ops.
            int i = GLES20.glGetUniformLocation(mProgram, "width");
            GLES20.glUniform1f(i, (float) mTextureWidth);

            i = GLES20.glGetUniformLocation(mProgram, "height");
            GLES20.glUniform1f(i, (float) mTextureHeight);
        } else {
            mTextureWidth = 0;
            mTextureHeight = 0;
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
        mViewportWidth = width;
        mViewportHeight = height;
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        mFrameLock.lock();

        if (mCurrentFrame != null && !mVideoDisabled) {
            GLES20.glUseProgram(mProgram);

            if (mTextureWidth != mCurrentFrame.getWidth() || mTextureHeight != mCurrentFrame.getHeight()) {
                setupTextures(mCurrentFrame);
            }

            updateTextures(mCurrentFrame);

            Matrix.setIdentityM(mScaleMatrix, 0);
            float scaleX = 1.0f, scaleY = 1.0f;
            float ratio = (float) mCurrentFrame.getWidth() / mCurrentFrame.getHeight();
            float vratio = (float) mViewportWidth / mViewportHeight;

            if (mVideoFitEnabled) {
                if (ratio > vratio) {
                    scaleY = vratio / ratio;
                } else {
                    scaleX = ratio / vratio;
                }
            } else {
                if (ratio < vratio) {
                    scaleY = vratio / ratio;
                } else {
                    scaleX = ratio / vratio;
                }
            }

            Matrix.scaleM(mScaleMatrix, 0, scaleX * (mCurrentFrame.isMirroredX() ? -1.0f : 1.0f), scaleY, 1);

            int mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
            GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mScaleMatrix, 0);

            GLES20.glDrawElements(GLES20.GL_TRIANGLES, mVertexIndex.length, GLES20.GL_UNSIGNED_SHORT, mDrawListBuffer);
        }

        mFrameLock.unlock();
    }

    public void displayFrame(Frame frame) {
        mFrameLock.lock();
        if (this.mCurrentFrame != null) {
            this.mCurrentFrame.recycle();
        }

        this.mCurrentFrame = frame;
        mFrameLock.unlock();
    }

    public static int loadShader(int type, String shaderCode) {
        int shader = GLES20.glCreateShader(type);

        GLES20.glShaderSource(shader, shaderCode);
        GLES20.glCompileShader(shader);

        return shader;
    }

    public void disableVideo(boolean b) {
        mFrameLock.lock();

        mVideoDisabled = b;

        if (mVideoDisabled) {
            if (this.mCurrentFrame != null) {
                this.mCurrentFrame.recycle();
            }

            this.mCurrentFrame = null;
        }

        mFrameLock.unlock();
    }

    public void enableVideoFit(boolean enableVideoFit) {
        mVideoFitEnabled = enableVideoFit;
    }

    public static class Frame {

        public boolean isMirrored = false;
        public int width, height, colorFormat;
        public ByteBuffer bb;

        public Frame(ByteBuffer bb, int colorFormat, int width, int height) {
            this.width = width;
            this.height = height;
            this.colorFormat = colorFormat;
            this.bb = bb;
        }

        public int getWidth() {
            return width;
        }

        public int getHeight() {
            return height;
        }

        public ByteBuffer getBuffer() {
            return bb;
        }

        public void recycle() {
            // just reset params, but don't in fact recycle
            // stream or image processing/storing may need this buffer
            bb.clear();
        }

        public boolean isMirroredX() {
            return isMirrored;
        }
    }
}
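As an aside, the fit/fill branch in onDrawFrame boils down to simple ratio math; isolated here as a plain helper (name is mine) for clarity:

```java
final class AspectFit {
    // "Fit": keep the whole frame visible (letterboxed); shrink exactly one axis.
    static float[] fitScale(int frameW, int frameH, int viewW, int viewH) {
        float ratio = (float) frameW / frameH;   // frame aspect
        float vratio = (float) viewW / viewH;    // viewport aspect
        return ratio > vratio
                ? new float[]{1f, vratio / ratio}   // frame wider than view: shrink Y
                : new float[]{ratio / vratio, 1f};  // frame taller than view: shrink X
    }
}
```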

initialization:

mGLSurfaceView = rootView.findViewById(R.id.previewGLSurfaceView);
mGLSurfaceView.setEGLContextClientVersion(2);
mRenderer = new Yuv420PlanarRenderer();
mGLSurfaceView.setRenderer(mRenderer);
mGLSurfaceView.setPreserveEGLContextOnPause(true);
mGLSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);

frame passing:

mGLSurfaceView.queueEvent(new Runnable() {
    @Override
    public void run() {
        // colorFormat for future impl
        mRenderer.displayFrame(new Yuv420PlanarRenderer.Frame(data, colorFormat, width, height));
        mGLSurfaceView.requestRender();
    }
});
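One practical note: the callback hands over a byte[] while Frame takes a ByteBuffer. Wrapping is zero-copy but shares the array, so copy instead if the producer recycles its byte[] between callbacks (a sketch, helper name mine):

```java
import java.nio.ByteBuffer;

final class FrameBytes {
    // copy == false: zero-copy view that shares the array with the producer.
    // copy == true:  detached direct buffer, safe if the producer reuses data[].
    static ByteBuffer toBuffer(byte[] data, int dataSize, boolean copy) {
        if (!copy) {
            return ByteBuffer.wrap(data, 0, dataSize);
        }
        ByteBuffer bb = ByteBuffer.allocateDirect(dataSize);
        bb.put(data, 0, dataSize);
        bb.rewind();
        return bb;
    }
}
```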

Any better approaches (besides the answer to the linked source question)? Or a modification for COLOR_FormatYUV420SemiPlanar, or NV21 handling, aka a YuvFrame option instead of the custom Frame?
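For COLOR_FormatYUV420SemiPlanar / NV21 specifically, one route I'd try (untested sketch) is uploading the interleaved chroma plane as a single GL_LUMINANCE_ALPHA texture of size (w/2) x (h/2) and sampling both components at once, keeping the vertex shader above unchanged:

```glsl
// Semi-planar fragment shader sketch (same BT.601 constants as the answer above).
// NV12: u = .r (luminance), v = .a (alpha); for NV21 the pair is swapped.
precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D Ytex;   // GL_LUMINANCE, w x h
uniform sampler2D UVtex;  // GL_LUMINANCE_ALPHA, (w/2) x (h/2)
void main() {
    float y = 1.1643 * (texture2D(Ytex, vTextureCoord).r - 0.0625);
    vec2 uv = texture2D(UVtex, vTextureCoord).ra - vec2(0.5);
    gl_FragColor = vec4(y + 1.5958 * uv.y,
                        y - 0.39173 * uv.x - 0.8129 * uv.y,
                        y + 2.017 * uv.x,
                        1.0);
}
```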

snachmsm