
I'm trying to stream bitmaps over WebRTC. My capturer class looks approximately like this:

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Matrix;
import android.opengl.GLES20;
import android.opengl.GLUtils;

import androidx.annotation.Nullable;

import org.webrtc.CapturerObserver;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.TextureBufferImpl;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;
import org.webrtc.VideoSink;
import org.webrtc.YuvConverter;

// Capturer, ScreenConfig and CapturerCallback are my own classes that
// produce Bitmaps; they are not part of WebRTC.
public class BitmapCapturer implements VideoCapturer, VideoSink {
    private Capturer capturer;

    private int width;
    private int height;

    private SurfaceTextureHelper textureHelper;
    private Context appContext;
    @Nullable
    private CapturerObserver capturerObserver;

    @Override
    public void initialize(SurfaceTextureHelper surfaceTextureHelper,
                           Context context, CapturerObserver capturerObserver) {
        if (capturerObserver == null) {
            throw new RuntimeException("capturerObserver not set.");
        }
        this.appContext = context;
        this.textureHelper = surfaceTextureHelper;
        this.capturerObserver = capturerObserver;
        this.capturer = new Capturer();
        this.textureHelper.startListening(this);
    }

    @Override
    public void startCapture(int width, int height, int fps) {
        this.width = width;
        this.height = height;

        long start = System.nanoTime();
        textureHelper.setTextureSize(width, height);

        // Create the texture the bitmaps will be uploaded to.
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);

        // Flip vertically around the buffer center.
        Matrix matrix = new Matrix();
        matrix.preTranslate(0.5f, 0.5f);
        matrix.preScale(1f, -1f);
        matrix.preTranslate(-0.5f, -0.5f);

        YuvConverter yuvConverter = new YuvConverter();
        TextureBufferImpl buffer = new TextureBufferImpl(width, height,
                VideoFrame.TextureBuffer.Type.RGB, textures[0], matrix,
                textureHelper.getHandler(), yuvConverter, null);

        this.capturerObserver.onCapturerStarted(true);
        this.capturer.startCapture(new ScreenConfig(width, height),
            new CapturerCallback() {
                @Override
                public void onFrame(Bitmap bitmap) {
                    // Upload the bitmap and hand the converted frame to WebRTC
                    // on the SurfaceTextureHelper's GL thread.
                    textureHelper.getHandler().post(() -> {
                        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
                        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
                        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);

                        long frameTime = System.nanoTime() - start;
                        VideoFrame videoFrame = new VideoFrame(buffer.toI420(), 0, frameTime);
                        capturerObserver.onFrameCaptured(videoFrame);
                        videoFrame.release();
                    });
                }
            });
    }

    @Override
    public void onFrame(VideoFrame videoFrame) {
        // Frames delivered by the SurfaceTextureHelper are forwarded as-is.
        capturerObserver.onFrameCaptured(videoFrame);
    }

    @Override
    public void stopCapture() throws InterruptedException {
    }

    @Override
    public void changeCaptureFormat(int width, int height, int fps) {
    }

    @Override
    public void dispose() {
    }

    @Override
    public boolean isScreencast() {
        return true;
    }
}
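
For context, the capturer is wired into a track roughly like this (a sketch of the usual setup; peerConnectionFactory, eglBase and applicationContext are assumed to be created elsewhere):

    SurfaceTextureHelper helper =
            SurfaceTextureHelper.create("CaptureThread", eglBase.getEglBaseContext());

    // isScreencast = true matches what the capturer reports.
    VideoSource videoSource = peerConnectionFactory.createVideoSource(true);

    BitmapCapturer capturer = new BitmapCapturer();
    capturer.initialize(helper, applicationContext, videoSource.getCapturerObserver());
    capturer.startCapture(1080, 1920, 30); // portrait size: width < height

    VideoTrack videoTrack = peerConnectionFactory.createVideoTrack("screen", videoSource);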

The resulting stream looks like this:

[screenshot: corrupted frame]

Below are the results of my experiments so far.

  1. If the frame is created with a rotation of 90 degrees, the stream looks normal:

        VideoFrame videoFrame = new VideoFrame(buffer.toI420(), 90, frameTime);

[screenshot: frame rotated]

  2. I tried swapping the TextureBuffer dimensions:

        TextureBufferImpl buffer = new TextureBufferImpl(height, width,
                VideoFrame.TextureBuffer.Type.RGB, textures[0], matrix,
                textureHelper.getHandler(), yuvConverter, null);

[screenshot: frame swapped]

  3. I also tried passing the height as both width and height:

        TextureBufferImpl buffer = new TextureBufferImpl(height, height,
                VideoFrame.TextureBuffer.Type.RGB, textures[0], matrix,
                textureHelper.getHandler(), yuvConverter, null);

[screenshot: stretched frame]

And here I'm confused. WebRTC somehow seems to expect a landscape frame even though every size was set to portrait. I've logged all the frame buffer sizes inside the WebRTC library, all the way up to the video encoder, and they are correct. The issue doesn't seem to be in the conversion method either: besides toI420() I've also tried converting with ARGBToI420 from libyuv, and the results were the same.
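
For reference, that libyuv test replaced the whole texture path with a CPU conversion. Below is a minimal Java-only sketch of the same ARGB-to-I420 conversion (bitmapToI420 is a hypothetical helper; the actual test called libyuv's ARGBToI420 through JNI, but the output layout is the same):

    // Requires: import java.nio.ByteBuffer; import org.webrtc.JavaI420Buffer;
    private VideoFrame.I420Buffer bitmapToI420(Bitmap bitmap) {
        int width = bitmap.getWidth();
        int height = bitmap.getHeight();
        int[] argb = new int[width * height];
        bitmap.getPixels(argb, 0, width, 0, 0, width, height);

        JavaI420Buffer buffer = JavaI420Buffer.allocate(width, height);
        ByteBuffer dataY = buffer.getDataY();
        ByteBuffer dataU = buffer.getDataU();
        ByteBuffer dataV = buffer.getDataV();

        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int color = argb[row * width + col];
                int r = (color >> 16) & 0xFF;
                int g = (color >> 8) & 0xFF;
                int b = color & 0xFF;

                // BT.601 studio-swing RGB -> YUV (the matrix ARGBToI420 uses).
                int yValue = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                dataY.put(row * buffer.getStrideY() + col, (byte) yValue);

                // Chroma planes are subsampled 2x2.
                if ((row & 1) == 0 && (col & 1) == 0) {
                    int uValue = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                    int vValue = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                    dataU.put((row / 2) * buffer.getStrideU() + (col / 2), (byte) uValue);
                    dataV.put((row / 2) * buffer.getStrideV() + (col / 2), (byte) vValue);
                }
            }
        }
        return buffer;
    }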

I'd be extremely grateful for any assistance.

Serhii Petrenko
  • Perhaps WebRTC is enforcing horizontal screen configurations. Check if there are some default options you can adjust to make things work in vertical. – Harkal Jun 17 '20 at 15:55
  • Have you found the solution yet? – Jaswant Singh Oct 06 '20 at 14:07
  • Why are you creating a new texture? To TextureBufferImpl you are supposed to pass the texture you want to read/convert from. Creating a new texture does not do much. But from your output, I guess your newly created texture is invalid (check if its value is 0), in which case TextureBufferImpl probably uses the texture that was previously active in the current context. Most likely that will be the system texture that draws your application, unless you are doing OpenGL on the main thread. – Pau Guillamon Sep 14 '22 at 16:00
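
For reference, the check suggested in the last comment would look roughly like this (a sketch; whether the failure shows up as a zero texture name or as a GL error can vary):

    // glGenTextures typically leaves the output at 0 when there is no current
    // GL context on the calling thread; in that case TextureBufferImpl would
    // read from whatever texture was previously bound.
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);
    if (textures[0] == 0 || GLES20.glGetError() != GLES20.GL_NO_ERROR) {
        throw new IllegalStateException("glGenTextures failed - no GL context?");
    }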
