I am trying to display a stream of frames received over the network in a TextureView. My pipeline is as follows:
- Receive video using GStreamer via the NDK. The GStreamer code is in C. A JNI callback sends each frame received in appsink from C to Java. I do not want to use ANativeWindow from within the NDK to render to the surface, as is done in the GStreamer Tutorial 3 example app.
- In Java, these frames are added to an ArrayBlockingQueue. A separate thread pulls from this queue.
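For context, the producer/consumer handoff in the two steps above looks roughly like this (the queue capacity and frame size are illustrative values I picked for the sketch, not fixed parts of my pipeline):

```java
import java.util.concurrent.ArrayBlockingQueue;

public class FrameQueueDemo {
    // Bounded queue so a slow consumer cannot exhaust memory.
    private static final ArrayBlockingQueue<byte[]> framesQueue =
            new ArrayBlockingQueue<>(30);

    public static void main(String[] args) throws InterruptedException {
        // Producer side (in the real app this is the JNI appsink callback):
        byte[] frame = new byte[768 * 576 * 3 / 2]; // NV21 is 1.5 bytes/pixel
        // offer() drops the frame when the queue is full instead of
        // blocking the GStreamer streaming thread.
        boolean queued = framesQueue.offer(frame);

        // Consumer side (the pullFromQueue thread): take() blocks until
        // a frame is available.
        byte[] pulled = framesQueue.take();
        System.out.println(queued + " " + pulled.length);
    }
}
```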
The following is the loop body of the pullFromQueue thread, which stays alive as long as the app is running. The byte[] frame is an NV21-format frame of known width and height.
@DebugLog
private void pullFromQueueMethod() {
    try {
        long start = System.currentTimeMillis();
        byte[] frame = framesQueue.take(); // blocks until a frame is available
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
}
From here, I would like to use OpenGL to alter brightness and contrast and apply shaders to individual frames. Performance is of utmost concern, so I cannot convert the byte[] to a Bitmap and then draw it to a SurfaceView; I have tried this, and it takes nearly 50 ms per 768x576 frame on a Nexus 5.
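For reference, the brightness/contrast adjustment I intend to run in a fragment shader is just a multiply-add per sample. Here is a CPU sketch of that math on the NV21 luma plane (the method name, parameter conventions, and the pivot at 128 are my own choices for illustration, not an established API):

```java
public class LumaAdjust {
    /**
     * Applies out = clamp(contrast * (in - 128) + 128 + brightness)
     * to the Y (luma) plane of an NV21 frame. In the real pipeline this
     * same formula would run per-fragment in GLSL; this CPU version only
     * documents the math.
     */
    public static void adjust(byte[] nv21, int width, int height,
                              float contrast, float brightness) {
        int lumaSize = width * height; // Y plane precedes interleaved VU data
        for (int i = 0; i < lumaSize; i++) {
            float y = nv21[i] & 0xFF;
            float adjusted = contrast * (y - 128f) + 128f + brightness;
            nv21[i] = (byte) Math.max(0f, Math.min(255f, adjusted));
        }
    }

    public static void main(String[] args) {
        byte[] frame = new byte[6]; // 2x2 frame: 4 luma bytes + 2 chroma bytes
        frame[0] = (byte) 100;
        adjust(frame, 2, 2, 1.0f, 20f); // brightness-only adjustment
        System.out.println(frame[0] & 0xFF); // 120
    }
}
```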
Surprisingly, I cannot find an example anywhere that does this. All examples use either the Camera or MediaPlayer built-in functions to direct their preview to a surface/texture, for example camera.setPreviewTexture(surfaceTexture);. This links the output to a SurfaceTexture, so you never have to display individual frames yourself (you never deal with byte arrays).
What I have attempted so far:
I have seen this answer on StackOverflow. It suggests using Grafika's createImageTexture(). Once I receive a texture handle, how do I pass it to a SurfaceTexture and update it continuously? Here is partial code of what I have implemented so far:
public class CameraActivity extends AppCompatActivity implements TextureView.SurfaceTextureListener {
    int textureId = -1;
    SurfaceTexture surfaceTexture;
    TextureView textureView;
    ...

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        textureView = new TextureView(this);
        textureView.setSurfaceTextureListener(this);
        setContentView(textureView);
    }
    private void pullFromQueueMethod() {
        try {
            long start = System.currentTimeMillis();
            byte[] frame = framesQueue.take();
            if (textureId == -1) {
                textureId = GlUtil.createImageTexture(frame);
                surfaceTexture = new SurfaceTexture(textureId);
                textureView.setSurfaceTexture(surfaceTexture);
            } else {
                // Self-defined method that does not create a new texture,
                // but calls GLES20.glTexImage2D() to update the existing one.
                GlUtil.updateImageTexture(textureId);
            }
            surfaceTexture.updateTexImage();
            /* What do I do from here? Is this the right approach? */
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
To sum up: all I really need is an efficient way to display a stream of frames (byte arrays). How do I achieve this?