
I am trying to encode an H.264 video using MediaCodec and the Camera preview callback (onPreviewFrame). I am stuck converting the color space from YV12 (delivered by the camera) to COLOR_FormatYUV420SemiPlanar (required by the encoder).
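
For reference, the encoder is configured roughly like this (a minimal sketch; the bitrate, frame rate and I-frame interval are placeholder values, not necessarily the ones I actually use):

// Sketch of the MediaCodec setup this question assumes; numeric values are placeholders.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);   // placeholder
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);      // placeholder
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // placeholder

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc"); // declares IOException on API 21+
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();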

Edit: I noticed this may be a device-specific MediaCodec bug, since the following code works on other devices:

public static byte[] YV12toYUV420PackedSemiPlanar(final byte[] input, final byte[] output, final int width, final int height) {
    /*
     * COLOR_TI_FormatYUV420PackedSemiPlanar is NV12
     * We convert by putting the corresponding U and V bytes together (interleaved).
     */
    final int frameSize = width * height;
    final int qFrameSize = frameSize / 4;

    System.arraycopy(input, 0, output, 0, frameSize); // Y

    for (int i = 0; i < qFrameSize; i++) {
        output[frameSize + i * 2] = input[frameSize + i + qFrameSize]; // Cb (U)
        output[frameSize + i * 2 + 1] = input[frameSize + i]; // Cr (V)
    }
    return output;
}
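
The converted buffer is then queued into the encoder from onPreviewFrame, roughly as sketched below (mEncoder and mEncodeBuffer are assumed fields of my class, and the timestamp handling is simplified):

// Sketch only: mEncoder/mEncodeBuffer are assumed fields; uses the
// pre-API-21 getInputBuffers() API since this has to run on API 16.
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    byte[] frame = YV12toYUV420PackedSemiPlanar(data, mEncodeBuffer, 1280, 720);

    int inputIndex = mEncoder.dequeueInputBuffer(10000); // timeout in microseconds
    if (inputIndex >= 0) {
        ByteBuffer inputBuffer = mEncoder.getInputBuffers()[inputIndex];
        inputBuffer.clear();
        inputBuffer.put(frame);
        mEncoder.queueInputBuffer(inputIndex, 0, frame.length,
                System.nanoTime() / 1000, 0);
    }
}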

This is the result I get (it seems like the color data has some offset):

[screenshot of the resulting frame with corrupted/shifted colors]

Edit 2: The frame size is 1280x720, and the device is a Samsung Galaxy S5 (SM-G900V) using OMX.qcom.video.encoder.avc and running Android 5.0 Lollipop (API 21).

Note: I know about COLOR_FormatSurface but I need to make this work on API 16.

user2880229

2 Answers


If this is running on a Qualcomm device prior to Android 4.3, you need to align the start of the U/V plane to a 2048 byte boundary. Something like this might work:

public static byte[] YV12toYUV420PackedSemiPlanar(final byte[] input, final byte[] output, final int width, final int height) {
    final int frameSize = width * height;
    final int alignedFrameSize = (frameSize + 2047)/2048*2048;
    final int qFrameSize = frameSize / 4;

    System.arraycopy(input, 0, output, 0, frameSize); // Y

    for (int i = 0; i < qFrameSize; i++) {
        output[alignedFrameSize + i * 2] = input[frameSize + i + qFrameSize]; // Cb (U)
        output[alignedFrameSize + i * 2 + 1] = input[frameSize + i]; // Cr (V)
    }
    return output;
}

This is a pretty well-known issue; prior to Android 4.3, the input formats to encoders weren't really tested strictly, so encoders could basically do whatever they wanted. (Beware, Samsung's encoders will behave even worse.) See https://code.google.com/p/android/issues/detail?id=37769 for a collection of other known issues.
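
As a sanity check, you can also dump the color formats the device's AVC encoders actually advertise (a small sketch using the pre-API-21 MediaCodecList API, since the question targets API 16; note that on these older devices the declared formats don't always describe the real buffer layout quirks):

// Sketch: log the color formats declared by each AVC encoder on the device.
for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
    MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
    if (!info.isEncoder()) continue;
    for (String type : info.getSupportedTypes()) {
        if (!type.equalsIgnoreCase("video/avc")) continue;
        MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
        for (int colorFormat : caps.colorFormats) {
            Log.d("CodecQuery", info.getName() + " -> 0x" + Integer.toHexString(colorFormat));
        }
    }
}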

mstorsjo
  • Thank you for your response, but unfortunately this did not fix the issue. I added some specifications about the device and frame size (Edit 2). The result looks exactly the same; I also read that this alignment issue does not affect 720p videos. http://stackoverflow.com/questions/20699009/how-to-get-stride-and-y-plane-alignment-values-for-mediacodec-encoder?rq=1 – user2880229 Dec 08 '15 at 20:00
  • You've pretty clearly got some sort of issue where you're not starting from the right place on the UV plane. How many bytes of data did you receive for the frame? Is it exactly 1280*720*1.5? – fadden Dec 11 '15 at 16:26

You can try this; it computes the row strides of the camera's YV12 buffer before interleaving the chroma planes:

public byte[] YV12toYUV420PackedSemiPlanar(final byte[] input, final byte[] output, final int width, final int height)
{
    // YV12 row strides as documented for ImageFormat.YV12:
    // yStride = ALIGN(width, 16), cStride = ALIGN(yStride / 2, 16)
    final int yStride = (int) Math.ceil(width / 16.0) * 16;
    final int cStride = (int) Math.ceil(yStride / 2 / 16.0) * 16;
    final int ySize = yStride * height;
    final int cSize = cStride * height / 2;
    final int halfWidth = width / 2;
    final int halfHeight = height / 2;

    for (int i = 0; i < height; i++)
        System.arraycopy(input, yStride * i, output, yStride * i, yStride); // Y

    for (int i = 0; i < halfHeight; i++) {
        for (int j = 0; j < halfWidth; j++) {
            output[ySize + (i * halfWidth + j) * 2] = input[ySize + cSize + i * cStride + j]; // Cb (U)
            output[ySize + (i * halfWidth + j) * 2 + 1] = input[ySize + i * cStride + j]; // Cr (V)
        }
    }

    return output;
}
Teocci