I have a callback invoked continuously on its own thread (not the main one), since this is 1920 x 1088 (yep, 88), 30 fps video:
@Override
public void onYuvDataReceived(MediaFormat mediaFormat, ByteBuffer byteBuffer,
final int width, final int height) {
From mediaFormat I can determine the colorFormat: COLOR_FormatYUV420SemiPlanar or COLOR_FormatYUV420Planar (I would prefer to support both, but at least one of these).
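For reference, reading it with the stock MediaFormat/MediaCodecInfo constants:
// Read the color format reported for the stream; constants come from
// MediaCodecInfo.CodecCapabilities
int colorFormat = mediaFormat.getInteger(MediaFormat.KEY_COLOR_FORMAT);
// NV12-style: Y plane followed by an interleaved UV plane
boolean semiPlanar = colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar;
// I420-style: three separate planes Y, U, V
boolean planar = colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar;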
Now I want to draw these frames, preferably on a TextureView, though a SurfaceView would also do. Conversion to RGB/Bitmap isn't efficient; in my case it can take 60+ ms per frame, while 30 fps leaves a budget of roughly 33 ms, so I should stick to some "native way" or "GPU way" (right?).
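For scale, the naive path I'm trying to avoid looks roughly like this (the 60+ ms figure is from my own timing of approximately this code):
// Slow reference path: NV21 -> JPEG -> Bitmap; easily 60+ ms per 1080p frame
YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
ByteArrayOutputStream out = new ByteArrayOutputStream();
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 90, out);
byte[] jpeg = out.toByteArray();
Bitmap bitmap = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);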
WebRTC way
I've found the WebRTC lib very helpful; it contains some breadcrumbs for my rendering case, but I can't achieve "video": only the first frame is drawn (properly, no issues).
int rowStrideY = width;
int rowStrideU = width / 2;
int rowStrideV = width / 2;
// TODO adjust to colorFormat
int basicOffset = byteBuffer.remaining() / 6;
int offsetY = 0;
int offsetU = basicOffset * 4;
int offsetV = basicOffset * 5;
ByteBuffer i420ByteBuffer = byteBuffer.duplicate();
i420ByteBuffer.position(offsetY);
final ByteBuffer dataY = i420ByteBuffer.slice();
i420ByteBuffer.position(offsetU);
final ByteBuffer dataU = i420ByteBuffer.slice();
i420ByteBuffer.position(offsetV);
final ByteBuffer dataV = i420ByteBuffer.slice();
JavaI420Buffer javaI420Buffer = JavaI420Buffer.wrap(width, height,
dataY, rowStrideY,
dataU, rowStrideU,
dataV, rowStrideV,
() -> {
JniCommon.nativeFreeByteBuffer(i420ByteBuffer);
});
VideoFrame frame = new VideoFrame(javaI420Buffer, 0, System.currentTimeMillis());
surfaceViewRenderer.onFrame(frame);
// turnOffYuv(); // no crash, but only the first frame is drawn
}
(some source of this can be found in HERE)
SurfaceViewRenderer, when fed the second and any further frame, throws:
FATAL EXCEPTION: SurfaceViewRendererEglRenderer
Process: thats.my.package, PID: 12970
java.lang.IllegalStateException: buffer is inaccessible
at java.nio.DirectByteBuffer.slice(DirectByteBuffer.java:159)
at org.webrtc.JavaI420Buffer.getDataY(JavaI420Buffer.java:118)
at org.webrtc.VideoFrameDrawer$YuvUploader.uploadFromBuffer(VideoFrameDrawer.java:114)
at org.webrtc.VideoFrameDrawer.drawFrame(VideoFrameDrawer.java:221)
at org.webrtc.EglRenderer.renderFrameOnRenderThread(EglRenderer.java:664)
at org.webrtc.EglRenderer.lambda$im8Sa54i366ODPy-soB9Bg4O-w4(Unknown Source:0)
at org.webrtc.-$$Lambda$EglRenderer$im8Sa54i366ODPy-soB9Bg4O-w4.run(Unknown Source:2)
at android.os.Handler.handleCallback(Handler.java:883)
at android.os.Handler.dispatchMessage(Handler.java:100)
at org.webrtc.EglRenderer$HandlerWithExceptionCallback.dispatchMessage(EglRenderer.java:103)
at android.os.Looper.loop(Looper.java:214)
at android.os.HandlerThread.run(HandlerThread.java:67)
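My guess is that the producer recycles byteBuffer as soon as the callback returns, while duplicate()/slice() still point at the same memory. A sketch of copying into a buffer I own first; JniCommon.nativeAllocateByteBuffer/nativeFreeByteBuffer are the real org.webrtc helpers, but the recycling theory and the tightly-packed-I420 layout are my assumptions:
// Copy the frame into memory we own, so the render thread can still read it
// after this callback returns
final ByteBuffer owned = JniCommon.nativeAllocateByteBuffer(byteBuffer.remaining());
owned.put(byteBuffer.duplicate());
owned.rewind();
int basicOffset = owned.remaining() / 6; // tightly packed I420 assumed, as above
owned.position(0);
ByteBuffer dataY = owned.slice();
owned.position(basicOffset * 4);
ByteBuffer dataU = owned.slice();
owned.position(basicOffset * 5);
ByteBuffer dataV = owned.slice();
JavaI420Buffer javaI420Buffer = JavaI420Buffer.wrap(width, height,
        dataY, width, dataU, width / 2, dataV, width / 2,
        () -> JniCommon.nativeFreeByteBuffer(owned)); // runs once all users have released
VideoFrame frame = new VideoFrame(javaI420Buffer, 0, System.nanoTime()); // timestamp is in ns
surfaceViewRenderer.onFrame(frame);
frame.release(); // the renderer retains its own reference; drop ours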
Some sources suggest creating a VideoTrack, calling addSink, etc., but I've failed to set up my own, and I'm also a bit confused and scared by the native methods in the code.
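For completeness, my renderer initialization, a sketch of the standard org.webrtc setup (this part appears to work, since the first frame does render; eglBase is a field I keep next to the view):
// SurfaceViewRenderer must be initialized with an EGL context before onFrame()
EglBase eglBase = EglBase.create();
surfaceViewRenderer.init(eglBase.getEglBaseContext(), null /* rendererEvents */);
surfaceViewRenderer.setScalingType(RendererCommon.ScalingType.SCALE_ASPECT_FIT);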
OpenGL ES way
Scared a bit by WebRTC's native underpinnings, I turned to another, more "pure" way for my purpose: OpenGL ES with a GLSurfaceView. I found THIS Renderer, adjusted it a bit, and got distorted-color, kind-of-stretched video, mostly dark, but very smooth...
onCreateView in the Fragment:
mGLSurfaceView = rootView.findViewById(R.id.GLSurfaceView);
mGLSurfaceView.setEGLContextClientVersion(2);
mRenderer = new NV21Renderer();
mGLSurfaceView.setRenderer(mRenderer);
mGLSurfaceView.setPreserveEGLContextOnPause(true);
mGLSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
frame passing:
@Override
public void onYuvData(MediaFormat mediaFormat, byte[] data, int dataSize, int width, int height) {
// data[] in here is NV21
//YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
// mediaFormat contains "original" colorFormat
mGLSurfaceView.queueEvent(new Runnable() {
@Override
public void run() {
mRenderer.onPreviewFrame(data);
mGLSurfaceView.requestRender();
}
});
}
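One thing I'm unsure about: whether data[] stays valid after the callback returns. If the producer reuses the array, copying before queueing seems safer; a sketch under that assumption (Arrays.copyOf is plain java.util):
// Defensive copy in case the producer recycles data[] between callbacks
final byte[] copy = Arrays.copyOf(data, dataSize);
mGLSurfaceView.queueEvent(() -> {
    mRenderer.onPreviewFrame(copy);
    mGLSurfaceView.requestRender();
});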
The renderer (these GLES20 calls are my first OpenGL lines ever):
public class NV21Renderer implements GLSurfaceView.Renderer {
public static final int recWidth = 1920;
public static final int recHeight = 1088;
private static final int LENGTH = recWidth * recHeight;
private static final int LENGTH_4 = recWidth * recHeight / 4;
private static final int U_INDEX = LENGTH;
private static final int V_INDEX = LENGTH + LENGTH_4;
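// NOTE: these offsets assume three separate planes (I420-style); in NV21 the
// bytes after the Y plane are interleaved V/U pairs, so uData/vData below end
// up holding mixed chroma samples (my suspect for the broken colors)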
private int[] yTextureNames;
private int[] uTextureNames;
private int[] vTextureNames;
private final FloatBuffer mVertices;
private final ShortBuffer mIndices;
private int mProgramObject;
private int mPositionLoc;
private int mTexCoordLoc;
private int yTexture;
private int uTexture;
private int vTexture;
private final ByteBuffer yBuffer;
private final ByteBuffer uBuffer;
private final ByteBuffer vBuffer;
byte[] ydata = new byte[LENGTH];
byte[] uData = new byte[LENGTH_4];
byte[] vData = new byte[LENGTH_4];
private boolean surfaceCreated = false;
private boolean dirty = false; // prevent drawing first frame when no data
public NV21Renderer() {
mVertices = ByteBuffer.allocateDirect(mVerticesData.length * 4)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mVertices.put(mVerticesData).position(0);
mIndices = ByteBuffer.allocateDirect(mIndicesData.length * 2)
.order(ByteOrder.nativeOrder()).asShortBuffer();
mIndices.put(mIndicesData).position(0);
yBuffer = ByteBuffer.allocateDirect(LENGTH);
uBuffer = ByteBuffer.allocateDirect(LENGTH_4);
vBuffer = ByteBuffer.allocateDirect(LENGTH_4);
}
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
Timber.d("onSurfaceCreated");
GLES20.glEnable(GLES20.GL_TEXTURE_2D);
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
final String vShaderStr = vertexShader;
final String fShaderStr = fragmentShader;
IntBuffer frameBuffer = IntBuffer.allocate(1);
IntBuffer renderBuffer = IntBuffer.allocate(1);
GLES20.glGenFramebuffers(1, frameBuffer);
GLES20.glGenRenderbuffers(1, renderBuffer);
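// NOTE: GL_ACTIVE_TEXTURE (0x84E0) is a glGet* query name, not a texture
// unit; glActiveTexture expects GL_TEXTURE0..GL_TEXTUREn, so this call fails
// with GL_INVALID_ENUM (it is repeated in onSurfaceChanged)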
GLES20.glActiveTexture(GLES20.GL_ACTIVE_TEXTURE);
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffer.get(0));
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBuffer.get(0));
GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16,
recWidth, recHeight);
IntBuffer parameterBufferHeight = IntBuffer.allocate(1);
IntBuffer parameterBufferWidth = IntBuffer.allocate(1);
GLES20.glGetRenderbufferParameteriv(GLES20.GL_RENDERBUFFER, GLES20.GL_RENDERBUFFER_WIDTH, parameterBufferWidth);
GLES20.glGetRenderbufferParameteriv(GLES20.GL_RENDERBUFFER, GLES20.GL_RENDERBUFFER_HEIGHT, parameterBufferHeight);
GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_RENDERBUFFER, renderBuffer.get(0));
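// NOTE: a depth-format renderbuffer attached as GL_COLOR_ATTACHMENT0 is
// unlikely to be framebuffer-complete, and this FBO is unbound below and
// never used again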
if (GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) != GLES20.GL_FRAMEBUFFER_COMPLETE) {
Timber.w("gl frame buffer status != frame buffer complete %s",
GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER));
}
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
mProgramObject = loadProgram(vShaderStr, fShaderStr);
mPositionLoc = GLES20.glGetAttribLocation(mProgramObject, "a_position");
mTexCoordLoc = GLES20.glGetAttribLocation(mProgramObject, "a_texCoord");
GLES20.glEnable(GLES20.GL_TEXTURE_2D);
yTexture = GLES20.glGetUniformLocation(mProgramObject, "y_texture");
yTextureNames = new int[1];
GLES20.glGenTextures(1, yTextureNames, 0);
int yTextureName = yTextureNames[0];
GLES20.glEnable(GLES20.GL_TEXTURE_2D);
uTexture = GLES20.glGetUniformLocation(mProgramObject, "u_texture");
uTextureNames = new int[1];
GLES20.glGenTextures(1, uTextureNames, 0);
int uTextureName = uTextureNames[0];
GLES20.glEnable(GLES20.GL_TEXTURE_2D);
vTexture = GLES20.glGetUniformLocation(mProgramObject, "v_texture");
vTextureNames = new int[1];
GLES20.glGenTextures(1, vTextureNames, 0);
int vTextureName = vTextureNames[0];
surfaceCreated = true;
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
Timber.d("onSurfaceChanged width:" + width + " height:" + height +
" surfaceCreated:" + surfaceCreated + "dirty:" + dirty);
GLES20.glActiveTexture(GLES20.GL_ACTIVE_TEXTURE);
GLES20.glViewport(0, 0, width, height);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
}
@Override
public final void onDrawFrame(GL10 gl) {
Timber.d("onDrawFrame surfaceCreated:" + surfaceCreated + " dirty:" + dirty);
if (!surfaceCreated || !dirty) return;
// Clear the color buffer
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Use the program object
GLES20.glUseProgram(mProgramObject);
// Load the vertex position
mVertices.position(0);
GLES20.glVertexAttribPointer(mPositionLoc, 3, GLES20.GL_FLOAT, false, 5 * 4, mVertices);
// Load the texture coordinate
mVertices.position(3);
GLES20.glVertexAttribPointer(mTexCoordLoc, 2, GLES20.GL_FLOAT, false, 5 * 4, mVertices);
GLES20.glEnableVertexAttribArray(mPositionLoc);
GLES20.glEnableVertexAttribArray(mTexCoordLoc);
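// NOTE: after all the re-binds below, the units end up as unit0=Y, unit1=U,
// unit2=V, unit3=V, while the uniforms say y=0, u=2, v=1, i.e. u_texture
// samples the V data and v_texture the U data (as far as I can trace);
// another suspect for the broken colors, next to the NV21 note above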
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTextureNames[0]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
recWidth, recHeight, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, yBuffer);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTextureNames[0]);
GLES20.glUniform1i(yTexture, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uTextureNames[0]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
recWidth / 2, recHeight / 2, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, uBuffer);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glActiveTexture(GLES20.GL_TEXTURE1 + 2);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uTextureNames[0]);
GLES20.glUniform1i(uTexture, 2);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vTextureNames[0]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
recWidth / 2, recHeight / 2, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, vBuffer);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glActiveTexture(GLES20.GL_TEXTURE1 + 1);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vTextureNames[0]);
GLES20.glUniform1i(vTexture, 1);
GLES20.glDrawElements(GLES20.GL_TRIANGLES, 6, GLES20.GL_UNSIGNED_SHORT, mIndices);
dirty = false;
}
private int loadShader(int type, String shaderSrc) {
int shader;
int[] compiled = new int[1];
shader = GLES20.glCreateShader(type);
if (shader == 0) {
return 0;
}
GLES20.glShaderSource(shader, shaderSrc);
GLES20.glCompileShader(shader);
GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
if (compiled[0] == 0) {
Timber.d("loadShader %s", GLES20.glGetShaderInfoLog(shader));
GLES20.glDeleteShader(shader);
return 0;
}
return shader;
}
private int loadProgram(String vertShaderSrc, String fragShaderSrc) {
int vertexShader;
int fragmentShader;
int programObject;
int[] linked = new int[1];
vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertShaderSrc);
if (vertexShader == 0) {
return 0;
}
fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragShaderSrc);
if (fragmentShader == 0) {
GLES20.glDeleteShader(vertexShader);
return 0;
}
programObject = GLES20.glCreateProgram();
if (programObject == 0) {
return 0;
}
GLES20.glAttachShader(programObject, vertexShader);
GLES20.glAttachShader(programObject, fragmentShader);
GLES20.glLinkProgram(programObject);
GLES20.glGetProgramiv(programObject, GLES20.GL_LINK_STATUS, linked, 0);
if (linked[0] == 0) {
Timber.e("Error linking program:%s", GLES20.glGetProgramInfoLog(programObject));
GLES20.glDeleteProgram(programObject);
return 0;
}
GLES20.glDeleteShader(vertexShader);
GLES20.glDeleteShader(fragmentShader);
return programObject;
}
public void onPreviewFrame(byte[] data) {
System.arraycopy(data, 0, ydata, 0, LENGTH);
yBuffer.put(ydata);
yBuffer.position(0);
System.arraycopy(data, U_INDEX, uData, 0, LENGTH_4);
uBuffer.put(uData);
uBuffer.position(0);
System.arraycopy(data, V_INDEX, vData, 0, LENGTH_4);
vBuffer.put(vData);
vBuffer.position(0);
dirty = true;
}
private static final String vertexShader =
"attribute vec4 a_position; \n" +
"attribute vec2 a_texCoord; \n" +
"varying vec2 v_texCoord; \n" +
"void main(){ \n" +
" gl_Position = a_position; \n" +
" v_texCoord = a_texCoord; \n" +
"} \n";
private static final String fragmentShader =
"#ifdef GL_ES \n" +
"precision highp float; \n" +
"#endif \n" +
"varying vec2 v_texCoord; \n" +
"uniform sampler2D y_texture; \n" +
"uniform sampler2D u_texture; \n" +
"uniform sampler2D v_texture; \n" +
"void main (void){ \n" +
" float r, g, b, y, u, v; \n" +
//GL_LUMINANCE replicates each Y value into the R, G and B components,
//that's why we're pulling it from the R component (G or B would work too)
//see https://stackoverflow.com/questions/12130790/yuv-to-rgb-conversion-by-fragment-shader/17615696#17615696
//and https://stackoverflow.com/questions/22456884/how-to-render-androids-yuv-nv21-camera-image-on-the-background-in-libgdx-with-o
" y = texture2D(y_texture, v_texCoord).r; \n" +
//Since we use GL_LUMINANCE, each component sits in its own texture
" u = texture2D(u_texture, v_texCoord).r - 0.5; \n" +
" v = texture2D(v_texture, v_texCoord).r - 0.5; \n" +
//The numbers are just YUV to RGB conversion constants
" r = y + 1.13983*v; \n" +
" g = y - 0.39465*u - 0.58060*v; \n" +
" b = y + 2.03211*u; \n" +
//We finally set the RGB color of our pixel
" gl_FragColor = vec4(r, g, b, 1.0); \n" +
"} \n";
private static final float[] mVerticesData = {
-1.f, 1.f, 0.0f, // Position 0
0.0f, 0.0f, // TexCoord 0
-1.f, -1.f, 0.0f, // Position 1
0.0f, 1.0f, // TexCoord 1
1.f, -1.f, 0.0f, // Position 2
1.0f, 1.0f, // TexCoord 2
1.f, 1.f, 0.0f, // Position 3
1.0f, 0.0f // TexCoord 3
};
private static final short[] mIndicesData = {0, 1, 2, 0, 2, 3};
}
On some Android 10 device this works, but gives very dark and stretched video, with colors mostly green and pink/red. My suspicion: the callback delivers NV21 (an interleaved VU plane after the Y plane), while the renderer reads U and V as two separate planes (and, per the note in onDrawFrame, samples them swapped).
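A CPU de-interleave sketch under that assumption (NV21 stores the V byte first in each pair):
// Split NV21's interleaved VU plane into the separate uData/vData arrays
int chromaStart = LENGTH; // the Y plane occupies recWidth * recHeight bytes
for (int i = 0; i < LENGTH_4; i++) {
    vData[i] = data[chromaStart + 2 * i];     // V sample
    uData[i] = data[chromaStart + 2 * i + 1]; // U sample
}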
But on Android 13 (Pixel) I'm always getting
signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x00000000
thrown in onSurfaceCreated. Is it misconfigured somehow...?
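To localize that crash I plan to check for GL errors after each call; a minimal helper using nothing beyond plain GLES20:
// Log and clear any pending GL error; sprinkle after each GLES20.* call while
// bisecting onSurfaceCreated
private static void checkGlError(String op) {
    int error;
    while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
        Timber.e("%s: glError 0x%s", op, Integer.toHexString(error));
    }
}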
So: how do I draw YUV/NV21 frames, from a simple byte array/buffer, as a picture/video on screen?
PS. The YUV stream/callback itself is fine: I can encode it with e.g. H.264 and drop it to an mp4 file or stream it out, no issues, or inspect a single frame via a JPEG generated by YuvImage. I just want to draw it in real time, as a "preview"...