
In simple words, all I need to do is display a live stream of video frames in Android (each frame is in YUV420 format). I have a callback function where I receive individual frames as a byte array. Something that looks like this:

public void onFrameReceived(byte[] frame, int height, int width, int format) {
    // display this frame to surfaceview/textureview.
}

A feasible but slow option is to convert the byte array to a Bitmap and draw it to a Canvas on the SurfaceView. In the future, I would ideally like to be able to alter the brightness, contrast, etc. of this frame, and hence am hoping I can use OpenGL ES for the same. What are my other options to do this efficiently?
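
For reference, a rough sketch of that slow path (YuvImage only accepts NV21 or YUY2, so this assumes the frame can be handed over in NV21 order; surfaceHolder is just a placeholder for the SurfaceView's holder):

public void onFrameReceived(byte[] frame, int height, int width, int format) {
    // Slow path: wrap the YUV bytes, compress to JPEG, decode to a Bitmap, draw on the canvas.
    YuvImage yuv = new YuvImage(frame, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, width, height), 90, out);
    byte[] jpeg = out.toByteArray();
    Bitmap bitmap = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);

    Canvas canvas = surfaceHolder.lockCanvas();
    if (canvas != null) {
        canvas.drawBitmap(bitmap, 0, 0, null);
        surfaceHolder.unlockCanvasAndPost(canvas);
    }
}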

Remember, unlike with the Camera or MediaPlayer classes, I can't direct my output to a SurfaceView/TextureView using camera.setPreviewTexture(surfaceTexture); as I am receiving individual frames via GStreamer in C.

Crearo Rotar
  • I went straight down the route of rendering the YUV frames directly with OpenGL ES. I can post the fragment shader and any other source from my code if it will help. – WLGfx Jan 16 '17 at 09:19
  • How you managed to render those YUV frames is precisely what I need to know! Were you receiving these frames in the form of a byte array like I am? (Source code would be helpful, thanks!) – Crearo Rotar Jan 16 '17 at 10:22

2 Answers


I'm using ffmpeg for my project, but the principle of rendering the YUV frame should be the same for you.

If a frame, for example, is 756 x 576, then the Y frame will be that size. The U and V frames are half the width and height of the Y frame, so you will have to account for the size differences.
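
In Android's Java GLES20 bindings that means uploading each plane as its own single-channel texture at its own size. A rough sketch, assuming tightly packed planar YUV420 data already split into yBuffer/uBuffer/vBuffer and texture handles generated beforehand:

GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1); // rows are tightly packed, not 4-byte aligned

// Y plane at full resolution.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTexture);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width, height,
        0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, yBuffer);

// U and V planes at half the width and height.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uTexture);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width / 2, height / 2,
        0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, uBuffer);

GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vTexture);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width / 2, height / 2,
        0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, vBuffer);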

I don't know about the camera API, but the frames I get from a DVB source also have a stride: extra pixels of padding at the end of each line in the frame. In case yours are the same, account for this when calculating your texture coordinates.

Adjusting the texture coordinates to account for the width and stride (linesize):

float u = 1.0f / buffer->y_linesize * buffer->wid; // adjust texture coord for edge

The vertex shader I've used takes screen coordinates from 0.0 to 1.0, but you can change these to suit. It also takes in the texture coords and a colour input. I've used the colour input so that I can add fading, etc.

Vertex shader:

#ifdef GL_ES
precision mediump float;
const float c1 = 1.0;
const float c2 = 2.0;
#else
const float c1 = 1.0f;
const float c2 = 2.0f;
#endif

attribute vec4 a_vertex;
attribute vec2 a_texcoord;
attribute vec4 a_colorin;
varying vec2 v_texcoord;
varying vec4 v_colorout;



void main(void)
{
    v_texcoord = a_texcoord;
    v_colorout = a_colorin;

    float x = a_vertex.x * c2 - c1;
    float y = -(a_vertex.y * c2 - c1);

    gl_Position = vec4(x, y, a_vertex.z, c1);
}

The fragment shader takes three uniform textures, one each for the Y, U and V frames, and converts them to RGB. It also multiplies by the colour passed in from the vertex shader:

#ifdef GL_ES
precision mediump float;
#endif

uniform sampler2D u_texturey;
uniform sampler2D u_textureu;
uniform sampler2D u_texturev;
varying vec2 v_texcoord;
varying vec4 v_colorout;

void main(void)
{
    float y = texture2D(u_texturey, v_texcoord).r;
    float u = texture2D(u_textureu, v_texcoord).r - 0.5;
    float v = texture2D(u_texturev, v_texcoord).r - 0.5;
    vec4 rgb = vec4(y + 1.403 * v,
                    y - 0.344 * u - 0.714 * v,
                    y + 1.770 * u,
                    1.0);
    gl_FragColor = rgb * v_colorout;
}

Each vertex is laid out as:

float   x, y, z;    // coords
float   s, t;       // texture coords
uint8_t r, g, b, a; // colour and alpha
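
That interleaved layout is 24 bytes per vertex. A sketch of how the attribute pointers could be set up through the GLES20 Java API (the attribute locations aVertex/aTexcoord/aColorin and the direct ByteBuffer vertexBuffer are placeholders, not names from my code):

final int stride = 24; // 3 floats position + 2 floats texcoord + 4 colour bytes

vertexBuffer.position(0); // x, y, z at byte offset 0
GLES20.glVertexAttribPointer(aVertex, 3, GLES20.GL_FLOAT, false, stride, vertexBuffer);
GLES20.glEnableVertexAttribArray(aVertex);

vertexBuffer.position(12); // s, t at byte offset 12
GLES20.glVertexAttribPointer(aTexcoord, 2, GLES20.GL_FLOAT, false, stride, vertexBuffer);
GLES20.glEnableVertexAttribArray(aTexcoord);

vertexBuffer.position(20); // r, g, b, a at byte offset 20, normalised to 0..1
GLES20.glVertexAttribPointer(aColorin, 4, GLES20.GL_UNSIGNED_BYTE, true, stride, vertexBuffer);
GLES20.glEnableVertexAttribArray(aColorin);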

Hope this helps!

EDIT:

For the NV12 format you can still use a fragment shader, although I've not tried it myself. It takes in the interleaved UV plane as a luminance-alpha texture or similar.

See here for how one person has answered this: https://stackoverflow.com/a/22456885/2979092

WLGfx
  • My original question was actually on how these byte frames would get displayed to a surface (the actual mechanism of how the frame would get drawn). Hence I can't accept this as the answer. However, this is helpful! Thanks a lot. The true format that my frames are in is NV12. (The UV components are interleaved). Do you know how I would extract U and V components individually? – Crearo Rotar Jan 22 '17 at 07:38
  • I've added an edit with a reference to the NV12 shader answer. – WLGfx Jan 22 '17 at 10:46
  • Yes, I ended up using that. Thanks a lot once again! I shall post my solution to how I displayed individual byte frames to textureview efficiently. I will however upvote your answer as it was very helpful, once I have enough reputation to do so! – Crearo Rotar Jan 22 '17 at 13:39
  • Excellent news. I've just found out this morning that I'm going to have to use that method for NV12 too. – WLGfx Jan 23 '17 at 08:58
  • @CrearoRotar Would you mind posting your solution for how you achieved that? – Amos Oct 31 '18 at 07:21

I took several answers from SO and various articles plus @WLGfx's answer above to come up with this:

I created two byte buffers, one for the Y part and one for the UV part of the frame. I then converted the byte buffers to textures using:

public static int createImageTexture(ByteBuffer data, int width, int height, int format, int textureHandle) {
    if (GLES20.glIsTexture(textureHandle)) {
        return updateImageTexture(data, width, height, format, textureHandle);
    }
    int[] textureHandles = new int[1];

    GLES20.glGenTextures(1, textureHandles, 0);
    textureHandle = textureHandles[0];
    GlUtil.checkGlError("glGenTextures");

    // Bind the texture handle to the 2D texture target.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle);

    // Configure min/mag filtering, i.e. what scaling method do we use if what we're rendering
    // is smaller or larger than the source image.
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    GlUtil.checkGlError("loadImageTexture");

    // Load the data from the buffer into the texture handle.
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, format, width, height,
            0, format, GLES20.GL_UNSIGNED_BYTE, data);
    GlUtil.checkGlError("loadImageTexture");

    return textureHandle;
}
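
updateImageTexture isn't shown above; a minimal sketch of it (assuming the width, height and format haven't changed since the texture was created) just re-uploads into the existing texture with glTexSubImage2D:

public static int updateImageTexture(ByteBuffer data, int width, int height, int format, int textureHandle) {
    // Re-upload the new frame data into the already-allocated texture.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle);
    GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, width, height,
            format, GLES20.GL_UNSIGNED_BYTE, data);
    GlUtil.checkGlError("updateImageTexture");
    return textureHandle;
}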

Both of these textures are then passed as normal 2D textures to the GLSL fragment shader:

precision highp float;
varying vec2 vTextureCoord;
uniform sampler2D sTextureY;
uniform sampler2D sTextureUV;
uniform float sBrightnessValue;
uniform float sContrastValue;
void main (void) {
    float r, g, b, y, u, v;
    // We had put the Y values of each pixel to the R,G,B components by GL_LUMINANCE,
    // that's why we're pulling it from the R component, we could also use G or B
    y = texture2D(sTextureY, vTextureCoord).r;
    // We had put the U and V values of each pixel to the A and R,G,B components of the
    // texture respectively using GL_LUMINANCE_ALPHA. Since U,V bytes are interspread
    // in the texture, this is probably the fastest way to use them in the shader
    u = texture2D(sTextureUV, vTextureCoord).r - 0.5;
    v = texture2D(sTextureUV, vTextureCoord).a - 0.5;
    // The numbers are just YUV to RGB conversion constants
    r = y + 1.13983*v;
    g = y - 0.39465*u - 0.58060*v;
    b = y + 2.03211*u;
    // setting brightness/contrast
    r = r * sContrastValue + sBrightnessValue;
    g = g * sContrastValue + sBrightnessValue;
    b = b * sContrastValue + sBrightnessValue;
    // We finally set the RGB color of our pixel
    gl_FragColor = vec4(r, g, b, 1.0);
}
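
For completeness, a rough sketch of how the incoming frame can be split into the two buffers and how the two textures are wired to the sampler uniforms. It assumes an NV12-style layout (Y plane of width * height bytes followed by an interleaved chroma plane of width * height / 2 bytes); mProgram, yTexture and uvTexture are placeholders:

// Split the frame into the Y plane and the interleaved UV plane.
ByteBuffer yBuffer = ByteBuffer.allocateDirect(width * height);
ByteBuffer uvBuffer = ByteBuffer.allocateDirect(width * height / 2);
yBuffer.put(frame, 0, width * height).position(0);
uvBuffer.put(frame, width * height, width * height / 2).position(0);

// Y as a single-channel texture, UV as a two-channel (luminance-alpha) texture at half size.
// Note: NV12 stores U then V; NV21 stores V then U, which swaps .r and .a in the shader.
yTexture = createImageTexture(yBuffer, width, height, GLES20.GL_LUMINANCE, yTexture);
uvTexture = createImageTexture(uvBuffer, width / 2, height / 2, GLES20.GL_LUMINANCE_ALPHA, uvTexture);

// Bind each texture to its own unit and point the sampler uniforms at those units.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTexture);
GLES20.glUniform1i(GLES20.glGetUniformLocation(mProgram, "sTextureY"), 0);

GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uvTexture);
GLES20.glUniform1i(GLES20.glGetUniformLocation(mProgram, "sTextureUV"), 1);
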
Crearo Rotar
  • Thanks for sharing! The Y and UV buffers can be rendered to the screen now, but it's very red, and if the UV's .r and .a are swapped it's very blue. Do you know the reason? – Amos Nov 08 '18 at 09:27
  • Haha, I've always had very green screens, never red or blue :P Are you sure your incoming frame is YUV420? – Crearo Rotar Nov 08 '18 at 11:11
  • Ah, green... that's interesting. Yep, the incoming frames are in NV21 format, the default Android preview format... as I mentioned in the last comment, I tried swapping the UV's .r and .a to see whether NV21 vs NV12 made a difference, but the result is still red or blue... Anyway, did you resolve your green screens? – Amos Nov 09 '18 at 02:27
  • There were several things really, and since it was almost a year ago I don't remember what exactly it was, but it was to do with the conversions. Things I remember: - ensure you're clamping - double check the conversions. Though the above are for yuv420, it could be - take care of signed-ness. [from my blog, this might be interesting](https://bhardwajrish.blogspot.com/2018/07/javas-primitive-datatypes-are-signed.html) - try different conversions [like here](http://paulbourke.net/dataformats/nv12/) – Crearo Rotar Nov 09 '18 at 06:54
  • Finally I managed to resolve it; it was just a silly bug in rendering the 2nd texture :p I got the same greenish result as you now, haha! Anyway, thank you very much for your help! – Amos Nov 10 '18 at 10:29