32

How do I convert a Bitmap returned by BitmapFactory.decodeFile() to YUV format (similar to the byte array that the camera's onPreviewFrame() callback returns)?

Abhijeet Pathak

6 Answers

58

Here is some code that actually works:

    // untested function
    byte [] getNV21(int inputWidth, int inputHeight, Bitmap scaled) {

        int [] argb = new int[inputWidth * inputHeight];

        scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);

        byte [] yuv = new byte[inputWidth*inputHeight*3/2]; // NV21 uses 12 bits per pixel; assumes even dimensions
        encodeYUV420SP(yuv, argb, inputWidth, inputHeight);

        scaled.recycle();

        return yuv;
    }

    void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
        final int frameSize = width * height;

        int yIndex = 0;
        int uvIndex = frameSize;

        int a, R, G, B, Y, U, V;
        int index = 0;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {

                a = (argb[index] & 0xff000000) >>> 24; // a is not used obviously
                R = (argb[index] & 0xff0000) >> 16;
                G = (argb[index] & 0xff00) >> 8;
                B = (argb[index] & 0xff) >> 0;

                // well known RGB to YUV algorithm
                Y = ( (  66 * R + 129 * G +  25 * B + 128) >> 8) +  16;
                U = ( ( -38 * R -  74 * G + 112 * B + 128) >> 8) + 128;
                V = ( ( 112 * R -  94 * G -  18 * B + 128) >> 8) + 128;

                // NV21 has a plane of Y and interleaved planes of VU each sampled by a factor of 2
                //    meaning for every 4 Y pixels there are 1 V and 1 U.  Note the sampling is every other
                //    pixel AND every other scanline.
                yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
                if (j % 2 == 0 && i % 2 == 0) { // subsample chroma 2x2 (assumes even width and height)
                    yuv420sp[uvIndex++] = (byte)((V<0) ? 0 : ((V > 255) ? 255 : V));
                    yuv420sp[uvIndex++] = (byte)((U<0) ? 0 : ((U > 255) ? 255 : U));
                }

                index ++;
            }
        }
    }
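
For the original question's scenario (a Bitmap coming from BitmapFactory.decodeFile()), a minimal usage sketch might look like the following; the file path and the 640x480 target size are placeholders, and the bitmap is scaled to even dimensions because the encoder above assumes them:

    // Usage sketch (untested): decode a file, scale to an even-sized
    // resolution, and convert to NV21. Path and size are placeholders.
    Bitmap decoded = BitmapFactory.decodeFile("/sdcard/input.jpg");
    Bitmap scaled = Bitmap.createScaledBitmap(decoded, 640, 480, true);
    byte[] nv21 = getNV21(640, 480, scaled);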
Bo Persson
Fracdroid
  • Thanks! I'll try it and let you know. – Abhijeet Pathak Oct 27 '12 at 07:32
  • 6
    If anyone needs YV12 instead of NV21, this answer can be modified slightly to produce everyone's favorite tri-planar format instead: https://gist.github.com/wobbals/5725412 – wobbals Jun 06 '13 at 22:13
  • 5
    According to the yuv420sp spec, the U value is stored before V in the last third of the buffer, so I had to swap the rows inside the if-case to prevent blues/reds from being flipped (a sketch of the swapped write order follows these comments). – Mattias May 20 '14 at 08:04
  • Does anyone know if the `stride` component plays any part here? I'm having a problem converting an image where `stride != width`. The image gets rendered as angled horizontal stripes on the screen instead. – Roberto Andrade Feb 26 '15 at 20:48
  • Hmmm, seems like it should be possible to just change the rectangle you are rendering to. For instance, if you are using an array created from a 240x120 image, then rendering to a 250x110 rectangle should get you the results you want. Of course you will need to be mindful of overflow or underflow of the array. – Fracdroid Feb 27 '15 at 00:40
  • also see my question http://stackoverflow.com/questions/29645950/how-can-i-add-thermal-effect-to-yuv-image and http://stackoverflow.com/questions/29649137/how-to-modify-rgb-pixel-of-an-bitmap-to-look-different – Zar E Ahmer Apr 15 '15 at 15:11
  • Does NV21 support alpha component? If yes, how can I take the alpha component into account? – Bao Le Jul 20 '15 at 09:42
  • NV21 does not support alpha. Are you trying to maintain alpha for instance from a PNG? You could try to choose a color like GREEN (remember green screen?) and interpret that as alpha. This is what TV stations do to allow the weather man to appear to be in front of a large screen. He is actually just standing in front of a green curtain. – Fracdroid Aug 04 '15 at 18:22
  • 2
    I had problems using this method on a bitmap from `BitmapFactory.decodeResource()`, but it works fine when I instead load the bitmap from `BitmapFactory.decodeStream(getAssets().open("myasset.bmp"))`. – VinceFior Jan 11 '16 at 20:51
  • @Fracdroid You are starting `uvIndex` from frameSize. Won't it give `ArrayIndexOutOfBoundException`? I think it should be started from 0. – Nabin Bhandari Dec 19 '16 at 15:09
  • The alpha channel is not used. How to count the alpha channel? – Lewis Z Jul 07 '17 at 04:39
  • 8
    encodeYUV420SP throws ArrayIndexOutOfRange Exception when inputWidth or inputHeight is an odd number. Declaring the yuv array as `byte [] yuv = new byte[inputHeight * inputWidth + 2 * (int) Math.ceil(inputHeight/2.0) *(int) Math.ceil(inputWidth/2.0)];` solved the issue. – Mike Mat May 01 '18 at 11:01
  • 2
    DON'T ever use Java for such a conversion, use RenderScript or the libyuv library. Java conversion is VERY slow – user25 Aug 15 '18 at 23:22
  • How do I convert the byte[] result into planes? Which part is Y, which part is U, and which part is V? – Jesse May 28 '19 at 06:22
  • This worked 99% for me... I'm using Picasso to load the bitmap, and after running the whole process the video had a weird color palette... I then swapped `R` with `B` inside `encodeYUV420SP` and it worked 100% – Rafael Lima Oct 11 '19 at 02:05
  • I am trying to convert the byte array returned by _getNV21_ to rgb opencv mat and it is resulting in a crash. Although you mention it is NV21, COLOR_YUV2RGB_NV12 constant fails to convert it. You can check my question here - https://stackoverflow.com/questions/63260714/convert-yuv-byte-array-to-rgb-mat – yeshu Aug 05 '20 at 07:57
  • 1
    @user25 I would love to see a render script solution to this – Fracdroid Mar 10 '21 at 20:04
  • @MikeMat You saved my day, tnx!!!!!! – Dmitry Zinoviev Apr 25 '23 at 11:30
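
As Mattias's comment above notes, some consumers expect NV12 ordering (U before V) rather than NV21 (V before U). A minimal variation of the interleaved write inside encodeYUV420SP, assuming the same clamping as above, would be:

    if (j % 2 == 0 && i % 2 == 0) {
        // NV12: U first, then V (NV21 writes V first)
        yuv420sp[uvIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
        yuv420sp[uvIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
    }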
4

Following is the code for converting a Bitmap to YUV (NV21) format:

void yourFunction(){

    // mBitmap is your bitmap

    int mWidth = mBitmap.getWidth();
    int mHeight = mBitmap.getHeight();

    int[] mIntArray = new int[mWidth * mHeight];

    // Copy pixel data from the Bitmap into the 'mIntArray' array
    mBitmap.getPixels(mIntArray, 0, mWidth, 0, 0, mWidth, mHeight);

    // Output buffer for the YUV420SP (NV21) data: 12 bits per pixel
    byte[] yuvData = new byte[mWidth * mHeight * 3 / 2];

    // Call the encoding function: convert mIntArray to YUV binary data
    encodeYUV420SP(yuvData, mIntArray, mWidth, mHeight);

}

static public void encodeYUV420SP(byte[] yuv420sp, int[] rgba,
        int width, int height) {
    final int frameSize = width * height;

    int[] U, V;
    U = new int[frameSize];
    V = new int[frameSize];

    final int uvwidth = width / 2;

    int r, g, b, y, u, v;
    for (int j = 0; j < height; j++) {
        int index = width * j;
        for (int i = 0; i < width; i++) {

            r = Color.red(rgba[index]);
            g = Color.green(rgba[index]);
            b = Color.blue(rgba[index]);

            // rgb to yuv (parentheses matter: '+' binds tighter than '>>' in Java)
            y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
            u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
            v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;

            // clip y
            yuv420sp[index] = (byte) ((y < 0) ? 0 : ((y > 255) ? 255 : y));
            U[index] = u;
            V[index++] = v;
        }
    }

    // Interleave the 2x2-subsampled V and U values after the Y plane (NV21 order)
    int uvIndex = frameSize;
    for (int j = 0; j < height; j += 2) {
        for (int i = 0; i < uvwidth; i++) {
            int index = j * width + 2 * i;
            v = V[index];
            u = U[index];
            yuv420sp[uvIndex++] = (byte) ((v < 0) ? 0 : ((v > 255) ? 255 : v));
            yuv420sp[uvIndex++] = (byte) ((u < 0) ? 0 : ((u > 255) ? 255 : u));
        }
    }
}
Nabin Bhandari
Hitesh Patel
0

If using Java to convert a Bitmap to a YUV byte[] is too slow for you, you can try libyuv by Google.
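
For reference, the Java side of such a bridge could look roughly like the sketch below. The class, the library name, and the native method nativeArgbToNv21 are hypothetical; its C++ implementation would forward the pixels to libyuv's ARGB-to-NV21 conversion (check the libyuv headers for the exact function and signature).

    // Hypothetical JNI wrapper; nothing here is part of libyuv's own API.
    public class YuvConverter {
        static { System.loadLibrary("yuvconverter"); } // your own native library

        // The native side would hand the ARGB pixels to libyuv and fill 'out'
        // (size = width * height * 3 / 2).
        public static native void nativeArgbToNv21(int[] argb, int width, int height, byte[] out);

        public static byte[] toNV21(Bitmap bitmap) {
            int w = bitmap.getWidth();
            int h = bitmap.getHeight();
            int[] argb = new int[w * h];
            bitmap.getPixels(argb, 0, w, 0, 0, w, h);
            byte[] out = new byte[w * h * 3 / 2];
            nativeArgbToNv21(argb, w, h, out);
            return out;
        }
    }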

Sanster
0

Via the OpenCV library you can replace the encodeYUV420SP Java function with a single native OpenCV line, and it is roughly 4x faster:

Mat mFrame = Mat(height,width,CV_8UC4,pFrameData).clone();

Complete example:

Java side:

    Bitmap bitmap = mTextureView.getBitmap(mWidth, mHeight);
    int[] argb = new int[mWidth * mHeight];
    // get the ARGB pixels, then process them natively as an 8UC4 OpenCV Mat
    bitmap.getPixels(argb, 0, mWidth, 0, 0, mWidth, mHeight);
    // native method (NDK or CMake)
    processFrame8UC4(argb, mWidth, mHeight);

Native side (NDK):

extern "C" JNIEXPORT jint JNICALL Java_com_native_detector_Utils_processFrame8UC4
    (JNIEnv *env, jobject object, jintArray frame, jint width, jint height) {

    jint *pFrameData = env->GetIntArrayElements(frame, 0);
    // this is the single OpenCV line:
    Mat mFrame = Mat(height, width, CV_8UC4, pFrameData).clone();
    // what follows is just an extra example: conversion to gray
    Mat mout;
    cvtColor(mFrame, mout, CV_RGB2GRAY);
    int objects = face_detection(env, mout);
    env->ReleaseIntArrayElements(frame, pFrameData, 0);
    return objects;
}
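
If a planar YUV420 buffer (I420/YV12) is acceptable instead of NV21, the conversion can also be done from Java with OpenCV's bindings, without hand-written JNI. A rough sketch, assuming the OpenCV Android SDK is initialized and that the conversion constant name matches your OpenCV version:

    // Sketch using OpenCV's Java bindings; produces planar I420, not the
    // interleaved NV21 layout that the camera preview callback uses.
    Mat rgba = new Mat();
    org.opencv.android.Utils.bitmapToMat(bitmap, rgba);           // 8UC4 RGBA
    Mat yuv = new Mat();
    Imgproc.cvtColor(rgba, yuv, Imgproc.COLOR_RGBA2YUV_I420);     // (height * 3/2) x width, 8UC1
    byte[] bytes = new byte[(int) (yuv.total() * yuv.channels())];
    yuv.get(0, 0, bytes);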
Hpsaturn
-1

First extract the RGB components of each pixel, then apply the RGB-to-YUV transform (these are the BT.709 analog YUV coefficients):

r = (p >> 16) & 0xff;
g = (p >> 8) & 0xff;
b = p & 0xff;
y = 0.2126f * r + 0.7152f * g + 0.0722f * b;
u = -0.09991f * r - 0.33609f * g + 0.436f * b;
v = 0.615f * r - 0.55861f * g - 0.05639f * b;

y, u, and v are the components of the pixel's YUV representation.
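
Typed out against a Bitmap's pixel array, a minimal sketch of that per-pixel transform could look like the code below; to end up with an NV21 byte array you would additionally have to add the 128 chroma offset, clamp to 0..255, and 2x2-subsample U and V, as in the top answer above.

    // Per-pixel RGB -> analog YUV using the coefficients above (BT.709).
    int[] pixels = new int[width * height];
    bitmap.getPixels(pixels, 0, width, 0, 0, width, height);

    float[] y = new float[width * height];
    float[] u = new float[width * height];
    float[] v = new float[width * height];

    for (int i = 0; i < pixels.length; i++) {
        int p = pixels[i];
        int r = (p >> 16) & 0xff;
        int g = (p >> 8) & 0xff;
        int b = p & 0xff;
        y[i] = 0.2126f * r + 0.7152f * g + 0.0722f * b;
        u[i] = -0.09991f * r - 0.33609f * g + 0.436f * b;
        v[i] = 0.615f * r - 0.55861f * g - 0.05639f * b;
    }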

ahakkal
-1

The bmp file will be in RGB888 format, so you will need to convert it to YUV yourself. I have not come across any API in Android that will do this for you.

But you can do this yourself; see this link on how to do it.

bluefalcon