29

In my application, I need to display video frames received from a server in our Android app. The server sends video data at 50 frames per second, encoded in WebM, i.e. using libvpx to encode and decode the images.

After decoding with libvpx I get YUV data, which can be displayed over the image layout.

The current implementation is as follows: in the JNI / native C++ code we convert the YUV data to RGB data, and in the Android framework we call

public Bitmap createImage(byte[] bits, int width, int height, int scan) {
    System.out.println("video: creating bitmap");
    Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    bitmap.copyPixelsFromBuffer(ByteBuffer.wrap(bits));
    System.out.println("video: bitmap created");
    return bitmap;
}

to create the bitmap image.

To display the image in the ImageView, I use the following code:

img = createImage(imgRaw, imgInfo[0], imgInfo[1], 1);
if (img != null && !img.isRecycled()) {
    iv.setImageBitmap(img);
    //img.recycle();
    img = null;
    System.out.println("video: image displayed");
}

My question: overall this path takes approximately 40 ms per frame. Is there any way to optimize it?

1 -- Is there any way to display the YUV data in the ImageView directly?

2 -- Is there any other way to create the image (Bitmap) from the RGB data?

3 -- I believe I am creating a new image every time, but I suppose I should create the Bitmap only once and just supply a new buffer whenever a frame is received (roughly as sketched below).

Please share your views.
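For point 3, this is roughly what I have in mind (only a sketch; onFrame and the field names are placeholders, and it assumes the native side already delivers ARGB_8888 bytes):

// Sketch only: allocate the Bitmap and buffer once, refill them on every frame.
private Bitmap frameBitmap;        // created on the first frame
private ByteBuffer frameBuffer;    // wraps the RGB bytes coming from native code

void onFrame(byte[] rgbBytes, int width, int height) {
    if (frameBitmap == null) {
        frameBitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        frameBuffer = ByteBuffer.allocate(width * height * 4);
    }
    frameBuffer.clear();
    frameBuffer.put(rgbBytes);
    frameBuffer.rewind();
    frameBitmap.copyPixelsFromBuffer(frameBuffer);
    iv.setImageBitmap(frameBitmap);
    iv.invalidate(); // same Bitmap instance, so force a redraw
}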

Amitg2k12
  • `bitmap.copyPixelsFromBuffer(ByteBuffer.wrap(bits));` does the conversion from YUV to RGB? Or are you saying that's what you are doing in native? – weston Feb 08 '12 at 12:11
  • Here is one way to do it. Check out http://stackoverflow.com/questions/9192982/displaying-yuv-image-in-android – bob Jun 30 '12 at 20:34

5 Answers

46

The following code may solve your problem, and it may take less time on YUV-format data, because the YuvImage class is provided with the Android SDK.

You can try this:

ByteArrayOutputStream out = new ByteArrayOutputStream();
YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 50, out);
byte[] imageBytes = out.toByteArray();
Bitmap image = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
iv.setImageBitmap(image);

or

void yourFunction(byte[] data, int mWidth, int mHeight) {

    int[] mIntArray = new int[mWidth * mHeight];

    // Decode the YUV data into an integer array of ARGB pixels
    decodeYUV420SP(mIntArray, data, mWidth, mHeight);

    // Build the bitmap from the decoded pixels
    Bitmap bmp = Bitmap.createBitmap(mIntArray, mWidth, mHeight, Bitmap.Config.ARGB_8888);

    // Display the bitmap
    iv.setImageBitmap(bmp);
}

public static void decodeYUV420SP(int[] rgba, byte[] yuv420sp, int width,
        int height) {
    final int frameSize = width * height;

    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0)
                y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }

            // Fixed-point YUV -> RGB; results are in an 18-bit range
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);

            if (r < 0)
                r = 0;
            else if (r > 262143)
                r = 262143;
            if (g < 0)
                g = 0;
            else if (g > 262143)
                g = 262143;
            if (b < 0)
                b = 0;
            else if (b > 262143)
                b = 262143;

            // Pack into an ARGB_8888 pixel (shift the 18-bit values down to 8 bits per channel)
            rgba[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00)
                    | ((b >> 10) & 0xff);
        }
    }
}
Jay Snayder
Hitesh Patel
  • I tried your method decodeYUV420SP() but the image that it creates is not good. It is full of yellow and green waves. I did this one and it works: http://stackoverflow.com/questions/5272388/need-help-with-androids-nv21-format/12702836#12702836 – Derzu Oct 05 '12 at 14:24
  • I don't understand why the rgba[yp]=... is shifted by 8 bits. The commented-out line is more correct. I get a rotated image. – over_optimistic Apr 02 '13 at 14:15
  • Is there a way of doing this with lossless compression (i.e. not JPEG)? – Archimedes Trajano Aug 01 '13 at 17:48
  • Also see my questions http://stackoverflow.com/questions/29645950/how-can-i-add-thermal-effect-to-yuv-image and http://stackoverflow.com/questions/29649137/how-to-modify-rgb-pixel-of-an-bitmap-to-look-different – Zar E Ahmer Apr 15 '15 at 15:02
  • YuvImage is quite useless for this purpose: converting to JPEG and back to RGB for each frame is too slow for video playback, not to mention that JPEG is a lossy format, so the video will look worse than the original. – Pointer Null Dec 03 '16 at 21:24
  • Doesn't support YUV420P. – Johann Apr 17 '21 at 17:17
2

Another way would be to use ScriptIntrinsicYuvToRGB; this is more efficient than encoding (and decoding) a JPEG for every frame.

fun yuvByteArrayToBitmap(bytes: ByteArray, width: Int, height: Int): Bitmap {
    val rs = RenderScript.create(this)

    val yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs))
    val yuvType = Type.Builder(rs, Element.U8(rs)).setX(bytes.size)
    val input = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT)

    val rgbaType = Type.Builder(rs, Element.RGBA_8888(rs)).setX(width).setY(height)
    val output = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT)

    input.copyFrom(bytes)

    yuvToRgbIntrinsic.setInput(input)
    yuvToRgbIntrinsic.forEach(output)

    val bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    output.copyTo(bitmap)

    input.destroy()
    output.destroy()
    yuvToRgbIntrinsic.destroy()
    rs.destroy()

    return bitmap
}
wiomoc
1

Create a bitmap after getting the width and height in onCreate():

editedBitmap = Bitmap.createBitmap(widthPreview, heightPreview,
                android.graphics.Bitmap.Config.ARGB_8888);

And in onPreviewFrame.

int[] rgbData = decodeGreyscale(aNv21Byte, widthPreview, heightPreview);
editedBitmap.setPixels(rgbData, 0, widthPreview, 0, 0, widthPreview, heightPreview);

And

private int[] decodeGreyscale(byte[] nv21, int width, int height) {
    int pixelCount = width * height;
    int[] out = new int[pixelCount];
    for (int i = 0; i < pixelCount; ++i) {
        int luminance = nv21[i] & 0xFF;
        // Equivalent to Color.argb(0xFF, luminance, luminance, luminance),
        // without creating a Color object for each pixel.
        out[i] = 0xff000000 | luminance << 16 | luminance << 8 | luminance;
    }
    return out;
}

And a bonus, for rotating the preview from the front camera:

Matrix matrix = new Matrix();
if (cameraId == CameraInfo.CAMERA_FACING_FRONT) {
    matrix.setRotate(270F);
}

finalBitmap = Bitmap.createBitmap(editedBitmap, 0, 0, widthPreview, heightPreview, matrix, true);
Zar E Ahmer
  • Actually, this answers a different [question](https://stackoverflow.com/questions/14936829/create-grayscale-bitmap-image-from-byte-in-android). – Alex Cohn Mar 10 '19 at 12:31
1

Based on the answer of @wiomoc, here is the Java version:

Bitmap yuvByteArrayToBitmap(byte[] bytes, int width, int height)
{
    RenderScript rs = RenderScript.create(this);

    ScriptIntrinsicYuvToRGB yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));
    Type.Builder yuvType = new Type.Builder(rs, Element.U8(rs)).setX(bytes.length);
    Allocation input = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT);

    Type.Builder rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(width).setY(height);
    Allocation output = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT);

    input.copyFrom(bytes);

    yuvToRgbIntrinsic.setInput(input);
    yuvToRgbIntrinsic.forEach(output);

    Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    output.copyTo(bitmap);

    input.destroy();
    output.destroy();
    yuvToRgbIntrinsic.destroy();
    rs.destroy();

    return bitmap;
}
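If this conversion runs for every frame, a possible refinement (a sketch of my own, not part of the original answer; YuvToRgbConverter is just an illustrative name) is to create the RenderScript context, the intrinsic, the Allocations, and the Bitmap once and reuse them, so that only copyFrom/forEach/copyTo run per frame:

// Sketch: keep the RenderScript objects alive across frames instead of
// recreating them for every conversion.
class YuvToRgbConverter {
    private final RenderScript rs;
    private final ScriptIntrinsicYuvToRGB script;
    private Allocation input;
    private Allocation output;
    private Bitmap bitmap;

    YuvToRgbConverter(Context context) {
        rs = RenderScript.create(context);
        script = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));
    }

    Bitmap convert(byte[] yuvBytes, int width, int height) {
        if (bitmap == null || bitmap.getWidth() != width || bitmap.getHeight() != height) {
            // (Re)allocate only when the frame size changes.
            Type yuvType = new Type.Builder(rs, Element.U8(rs)).setX(yuvBytes.length).create();
            input = Allocation.createTyped(rs, yuvType, Allocation.USAGE_SCRIPT);

            Type rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(width).setY(height).create();
            output = Allocation.createTyped(rs, rgbaType, Allocation.USAGE_SCRIPT);

            bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
            script.setInput(input);
        }
        input.copyFrom(yuvBytes);
        script.forEach(output);
        output.copyTo(bitmap);
        return bitmap;
    }
}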
Hank Chang
0

Based on the accepted answer, I found a considerably faster way to do the YUV-to-RGB conversion using the RenderScript intrinsic conversion method. I found a direct example here: Yuv2RgbRenderScript.

It can be as simple as copying the convertYuvToRgbIntrinsic method from the RenderScriptHelper class to replace the decodeYUV420SP that Hitesh Patel gives in his answer. You will also need to initialize a RenderScript object (the example does this in the MainActivity class).

And don't forget to enable RenderScript in the project's Gradle configuration (the Android documentation describes how to do this).

S. serra