41

I am capturing images using a SurfaceView and getting the YUV raw preview data in public void onPreviewFrame(byte[] data, Camera camera).

I have to perform some image preprocessing in onPreviewFrame, so I need to convert the YUV preview data to RGB data, do the image preprocessing, and then convert back to YUV data.

I have used the following functions for decoding the YUV data to RGB and encoding it back:

public void onPreviewFrame(byte[] data, Camera camera) {
    Point cameraResolution = configManager.getCameraResolution();
    if (data != null) {
        Log.i("DEBUG", "data Not Null");

        // Preprocessing
        Log.i("DEBUG", "Try For Image Processing");
        Camera.Parameters mParameters = camera.getParameters();
        Size mSize = mParameters.getPreviewSize();
        int mWidth = mSize.width;
        int mHeight = mSize.height;
        int[] mIntArray = new int[mWidth * mHeight];

        // Decode Yuv data to integer array
        decodeYUV420SP(mIntArray, data, mWidth, mHeight);

        // Converting int mIntArray to Bitmap,
        // then image preprocessing,
        // and back to mIntArray.

        // Encode intArray to Yuv data
        encodeYUV420SP(data, mIntArray, mWidth, mHeight);
    }
}
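
The Bitmap round trip in the comment above would be roughly like this (a sketch only; doPreprocessing is a placeholder for my actual image processing):

// Create a mutable Bitmap and fill it from the decoded RGB array
Bitmap bmp = Bitmap.createBitmap(mWidth, mHeight, Bitmap.Config.ARGB_8888);
bmp.setPixels(mIntArray, 0, mWidth, 0, 0, mWidth, mHeight);

doPreprocessing(bmp); // hypothetical preprocessing step

// Copy the processed pixels back into the int array before re-encoding
bmp.getPixels(mIntArray, 0, mWidth, 0, 0, mWidth, mHeight);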

    static public void decodeYUV420SP(int[] rgba, byte[] yuv420sp, int width,
        int height) {
    final int frameSize = width * height;

    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0)
                y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }

            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);

            if (r < 0)
                r = 0;
            else if (r > 262143)
                r = 262143;
            if (g < 0)
                g = 0;
            else if (g > 262143)
                g = 262143;
            if (b < 0)
                b = 0;
            else if (b > 262143)
                b = 262143;

            // rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) &
            // 0xff00) | ((b >> 10) & 0xff);
            // rgba, divide 2^10 ( >> 10)
            rgba[yp] = ((r << 14) & 0xff000000) | ((g << 6) & 0xff0000)
                    | ((b >> 2) | 0xff00);
        }
    }
}


    static public void encodeYUV420SP_original(byte[] yuv420sp, int[] rgba,
        int width, int height) {
    final int frameSize = width * height;

    int[] U, V;
    U = new int[frameSize];
    V = new int[frameSize];

    final int uvwidth = width / 2;

    int r, g, b, y, u, v;
    for (int j = 0; j < height; j++) {
        int index = width * j;
        for (int i = 0; i < width; i++) {
            r = (rgba[index] & 0xff000000) >> 24;
            g = (rgba[index] & 0xff0000) >> 16;
            b = (rgba[index] & 0xff00) >> 8;

            // rgb to yuv
            y = (66 * r + 129 * g + 25 * b + 128) >> 8 + 16;
            u = (-38 * r - 74 * g + 112 * b + 128) >> 8 + 128;
            v = (112 * r - 94 * g - 18 * b + 128) >> 8 + 128;

            // clip y
            yuv420sp[index++] = (byte) ((y < 0) ? 0 : ((y > 255) ? 255 : y));
            U[index] = u;
            V[index++] = v;
        }
    }
}

The problem is that the encoding and decoding of the YUV data seem to have some mistake, because even if I skip the preprocessing step, the re-encoded YUV data differ from the original data from the PreviewCallback.

Please help me to resolve this issue. I have to use this code for OCR scanning, so I need to implement this type of logic.

If there is any other way of doing the same thing, please let me know.

Thanks in advance. :)

Hitesh Patel

10 Answers

51

Although the documentation suggests that you can set which format the image data should arrive from the camera in, in practice you often have a choice of one: NV21, a YUV format. For lots of information on this format see http://www.fourcc.org/yuv.php#NV21 and for information on the theory behind converting it to RGB see http://www.fourcc.org/fccyvrgb.php. There is a picture-based explanation at Extract black and white image from android camera's NV21 format. There is an Android-specific section on a Wikipedia page about the subject (thanks @AlexCohn): YUV#Y'UV420sp (NV21) to RGB conversion (Android).

However, once you've set up your onPreviewFrame routine, the mechanics of going from the byte array it sends you to useful data are somewhat, ummmm, unclear. From API 8 onwards, the following solution is available, to get to a ByteStream holding a JPEG of the image (compressToJpeg is the only conversion option offered by YuvImage):

// pWidth and pHeight define the size of the preview Frame
ByteArrayOutputStream out = new ByteArrayOutputStream();

// Alter the second parameter of this to the actual format you are receiving
YuvImage yuv = new YuvImage(data, ImageFormat.NV21, pWidth, pHeight, null);

// bWidth and bHeight define the size of the bitmap you wish the fill with the preview image
yuv.compressToJpeg(new Rect(0, 0, bWidth, bHeight), 50, out);

This JPEG may then need to be converted into the format you want. If you want a Bitmap:

byte[] bytes = out.toByteArray();
Bitmap bitmap= BitmapFactory.decodeByteArray(bytes, 0, bytes.length);

If, for whatever reason, you are unable to do this, you can do the conversion manually. Some problems to be overcome in doing this:

  1. The data arrives in a byte array. By definition, bytes are signed numbers, meaning that they go from -128 to 127. However, the data is actually unsigned bytes (0 to 255). If this isn't dealt with, the outcome is doomed to have some odd clipping effects.

  2. The data is in a very specific order (as per the previously mentioned web pages) and each pixel needs to be extracted carefully.

  3. Each pixel needs to be put into the right place on a bitmap, say. This also requires a rather messy (in my view) approach of building a buffer of the data and then filling a bitmap from it.

  4. In principle, the values should be stored [16..240], but it appears that they are stored [0..255] in the data sent to onPreviewFrame

  5. Just about every web page on the matter proposes different coefficients, even allowing for [16..240] vs [0..255] options.

  6. If you've actually got NV12 (another variant on YUV420), then you will need to swap the reads for U and V.

I present a solution (which seems to work), with requests for corrections, improvements and ways of making the whole thing less costly to run. I have set it out to hopefully make clear what is happening, rather than to optimise it for speed. It creates a bitmap the size of the preview image:

The data variable is coming from the call to onPreviewFrame

// Define whether expecting [16..240] or [0..255]
boolean dataIs16To240 = false;

// the bitmap we want to fill with the image
Bitmap bitmap = Bitmap.createBitmap(imageWidth, imageHeight, Bitmap.Config.ARGB_8888);
int numPixels = imageWidth*imageHeight;

// the buffer we fill up which we then fill the bitmap with
IntBuffer intBuffer = IntBuffer.allocate(imageWidth*imageHeight);
// If you're reusing a buffer, the next line is needed to refill from the start;
// if not, it's still good practice
intBuffer.position(0);

// Set the alpha for the image: 0 is transparent, 255 fully opaque
final byte alpha = (byte) 255;

// Holding variables for the loop calculation
int R = 0;
int G = 0;
int B = 0;

// Get each pixel, one at a time
for (int y = 0; y < imageHeight; y++) {
    for (int x = 0; x < imageWidth; x++) {
        // Get the Y value, stored in the first block of data
        // The logical "AND 0xff" is needed to deal with the signed issue
        float Y = (float) (data[y*imageWidth + x] & 0xff);

        // Get U and V values, stored after Y values, one per 2x2 block
        // of pixels, interleaved. Prepare them as floats with correct range
        // ready for calculation later.
        int xby2 = x/2;
        int yby2 = y/2;

        // make this V for NV12/420SP
        float U = (float)(data[numPixels + 2*xby2 + yby2*imageWidth] & 0xff) - 128.0f;

        // make this U for NV12/420SP
        float V = (float)(data[numPixels + 2*xby2 + 1 + yby2*imageWidth] & 0xff) - 128.0f;

        // Scaled Y value used in the conversion below
        float Yf;
        if (dataIs16To240) {
            // Correct Y to allow for the fact that it is [16..235] and not [0..255]
            Yf = 1.164f * (Y - 16.0f);

            // Do the YUV -> RGB conversion
            // These seem to work, but other variations are quoted
            // out there.
            R = (int)(Yf + 1.596f*V);
            G = (int)(Yf - 0.813f*V - 0.391f*U);
            B = (int)(Yf            + 2.018f*U);
        }
        else {
            // No need to correct Y
            Yf = Y;

            // These are the coefficients proposed by @AlexCohn
            // for [0..255], as per the wikipedia page referenced
            // above
            R = (int)(Yf + 1.370705f*V);
            G = (int)(Yf - 0.698001f*V - 0.337633f*U);
            B = (int)(Yf               + 1.732446f*U);
        }
              
        // Clip rgb values to 0-255
        R = R < 0 ? 0 : R > 255 ? 255 : R;
        G = G < 0 ? 0 : G > 255 ? 255 : G;
        B = B < 0 ? 0 : B > 255 ? 255 : B;

        // Put that pixel in the buffer
        intBuffer.put(alpha*16777216 + R*65536 + G*256 + B);
    }
}

// Get buffer ready to be read
intBuffer.flip();

// Push the pixel information from the buffer onto the bitmap.
bitmap.copyPixelsFromBuffer(intBuffer);

As @Timmmm points out below, you could do the conversion in int by multiplying the scaling factors by 1000 (i.e. 1.164 becomes 1164) and then dividing the end results by 1000.
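
For illustration, an integer-only variant of the per-pixel conversion might look roughly like this (a sketch only; yuvToArgbInt is just an illustrative helper name, and the coefficients are the [0..255] ones above rounded after scaling by 1000):

// Integer-only YUV -> RGB for the [0..255] case, coefficients scaled by 1000
// (1.370705 -> 1371, 0.698001 -> 698, 0.337633 -> 338, 1.732446 -> 1732).
// y is the masked Y byte (0..255); u and v are the masked U/V bytes minus 128.
static int yuvToArgbInt(int y, int u, int v) {
    int r = (1000 * y + 1371 * v) / 1000;
    int g = (1000 * y - 698 * v - 338 * u) / 1000;
    int b = (1000 * y + 1732 * u) / 1000;

    // Clip to 0..255 as before
    r = r < 0 ? 0 : r > 255 ? 255 : r;
    g = g < 0 ? 0 : g > 255 ? 255 : g;
    b = b < 0 ? 0 : b > 255 ? 255 : b;

    // Pack as ARGB
    return 0xff000000 | (r << 16) | (g << 8) | b;
}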

Neil Townsend
  • Your clipping lines are a bit weird. You can remove either the first, or last two `R=` on each line. Also what is `numPixels`, and `float` is going to be quite slow; I'm pretty sure you can do this with just integers. – Timmmm Sep 11 '12 at 12:09
  • Good points, I've amended the code (first two points) and added a comment (third point) - I've left the core code as it is with floats because it's (I think) a bit clearer what's going on then, but you are right that it would be faster moving to ints. – Neil Townsend Oct 15 '12 at 13:02
  • I am getting: java.lang.ArrayIndexOutOfBoundsException on droid – zezba9000 Jan 23 '13 at 16:22
  • @Neil Townsend Sry, it works fine.... I was setting a height value to a width one... My bug, sry for the confusion. – zezba9000 Jan 24 '13 at 04:19
  • @Paul `previewBoxWidth` shouldn't be in the code, it should be `imageWidth` as now corrected. Thanks for pointing the slip out. – Neil Townsend Jun 03 '13 at 16:15
  • Am I the only one to think converting to JPEG and back to bitmap is a bit weird? – Violet Giraffe Sep 23 '14 at 11:52
  • @VioletGiraffe not sure I understand your comment - the data generated coming into this routine is a YUV format with no application of compression other than number of bits used (unlike JPEG). The goal is to get the data into a format usable for display. You could go from the incoming data to JPEG if you wish, but it would be more computation. – Neil Townsend Sep 24 '14 at 06:24
  • @NeilTownsend: I am referring to your first part of the answer that uses `compressToJpeg`. – Violet Giraffe Sep 24 '14 at 09:07
  • @VioletGiraffe Thanks for clarifying. I've tried to improve the answer to explain the process better, hope that helps. – Neil Townsend Sep 24 '14 at 14:27
  • also have a look at my question http://stackoverflow.com/questions/29649137/how-to-modify-rgb-pixel-of-an-bitmap-to-look-different and http://stackoverflow.com/questions/29645950/how-can-i-add-thermal-effect-to-yuv-image – Zar E Ahmer Apr 15 '15 at 15:10
  • Unfortunately, this is a wrong formula. It is for YUV video, as defined in ITU-R BT.601 standard, with Y range of [16..235]. The camera image comes in YCbCr color space with Y range of [0..255]. The change is not very strong, but significant for the human eye. – Alex Cohn Jun 22 '20 at 12:18
  • @AlexCohn You are correct that Y is [16..235], and I apologise for not stating that specifically. The code did, however, incorporate it: the calculation of Yf (first line after the "Do the YUV ..." comment handles it). I will update my answer to make this clear, thanks. – Neil Townsend Jun 24 '20 at 12:38
  • @AlexCohn I hope that I have made it clearer now, if you have other suggestions for improvement please do let me know. (My previous comment about where the line was no longer makes sense now, of course!) – Neil Townsend Jun 24 '20 at 12:43
  • The problem was/is not that the code is wrong, or not relevant. The BT.601 color space is entirely legitimate; this is what you typically get from video decoders. The problem, let me reiterate, is that in the very specific situation when we need to convert an NV21 frame arriving in the Android camera onPreviewFrame callback, the color space is full [0..255] as in Jpeg, and can be easily verified by comparing the results of the first and the second code snippets. This remains true after 8 years, too, with the new Camera2 and cameraX APIs. – Alex Cohn Jun 25 '20 at 08:23
  • @AlexCohn To check I’ve understood: you are saying that the code is correct for the conversion in general, but for the specific case of an onPreviewFrame callback, which is what the original question is about, the Yf conversion line should simply be Yf=(float)Y ? – Neil Townsend Jun 27 '20 at 17:09
  • It's a bit more than change the formula for Yf. Actually, all coefficients change, see https://en.wikipedia.org/wiki/YUV#Y%E2%80%B2UV420sp_(NV21)_to_RGB_conversion_(Android). `R = Y + (1.370705 * V); G = Y - (0.698001 * V) - (0.337633 * U); B = Y + (1.732446 * U);` – Alex Cohn Jun 28 '20 at 12:31
  • @AlexCohn Many thanks, I hope that I have managed to amend the answer to cover what you are saying. If I am still not quite there, please do say so! – Neil Townsend Jun 29 '20 at 14:59
  • i love this answer. can you please elaborate on using the jpeg byte array. like converting that jpeg byte array into rgb byte array. ?? – Harkal Nov 20 '20 at 04:16
  • @Harkal As can be seen from https://developer.android.com/reference/android/graphics/YuvImage the only way to get the data out of yuvimage other than in yuv format is to get a jpeg out. You then need to convert that to the format you want. To get a bitmap, use the bitmapfactory code as shown. Not sure what else there is to say! – Neil Townsend Nov 21 '20 at 15:02
15

Why not specify that camera preview should provide RGB images?

i.e. Camera.Parameters.setPreviewFormat(ImageFormat.RGB_565);
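
Note that not every device supports an RGB preview format (the default is NV21), so it may be worth checking the supported formats first. A rough sketch, assuming camera is the already-opened Camera instance:

// Only request RGB_565 preview frames if the camera actually supports it;
// otherwise the callback will keep delivering the default NV21 format.
Camera.Parameters params = camera.getParameters();
List<Integer> formats = params.getSupportedPreviewFormats();
if (formats.contains(ImageFormat.RGB_565)) {
    params.setPreviewFormat(ImageFormat.RGB_565);
    camera.setParameters(params);
}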

Reuben Scratton
  • Thanks @Reuben, :) I have set PreviewFormat and now, no need for conversion. – Hitesh Patel Feb 17 '12 at 15:33
  • Notice that this will not work on all devices. The default format is YUV and devices that do not support preview as RGB will still provide you with YUV formatted images. – Muzikant Apr 21 '12 at 07:00
  • NV21 and YV12 are the image formats supported by all cameras; to use another format, check whether your camera supports the feature (if not, the callback will still provide data in NV21 format). mCamera = Camera.open(); Camera.Parameters params = mCamera.getParameters(); for(int i: params.getSupportedPreviewFormats()) { Log.e(TAG, "preview format supported are = "+i);} – amIT May 07 '14 at 13:00
8

You can use RenderScript -> ScriptIntrinsicYuvToRGB

Kotlin Sample

val rs = RenderScript.create(CONTEXT_HERE)
val yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs))

val yuvType = Type.Builder(rs, Element.U8(rs)).setX(byteArray.size)
val inData = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT)

val rgbaType = Type.Builder(rs, Element.RGBA_8888(rs)).setX(width).setY(height)
val outData = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT)

inData.copyFrom(byteArray)

yuvToRgbIntrinsic.setInput(inData)
yuvToRgbIntrinsic.forEach(outData)

val bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
outData.copyTo(bitmap)
Mateen Ulhaq
murgupluoglu
  • With a quick benchmark I found out that this performs ~4 times faster than the solution proposed by Neil with compressToJpeg – Ehsan Khaveh Feb 05 '20 at 18:42
  • Unfortunately, the intrinsic conversion is tuned for video processing, not for camera image stream (see [the formulae](https://android.googlesource.com/platform/frameworks/rs/+/refs/heads/master/cpu_ref/rsCpuIntrinsicYuvToRGB.cpp#61)). I have a [fix](https://stackoverflow.com/a/63575255/192373) that resolves this. – Alex Cohn Aug 25 '20 at 09:25
6

After some tests on a Samsung S4 mini, the fastest code is (120% faster than Neil's [floats!] and 30% faster than Hitesh's original):

static public void decodeYUV420SP(int[] rgba, byte[] yuv420sp, int width,
                                  int height) {

    final int frameSize = width * height;
    // define variables before loops (+ 20-30% faster algorithm o0`)
    int r, g, b, y1192, y, i, uvp, u, v;
    for (int j = 0, yp = 0; j < height; j++) {
        uvp = frameSize + (j >> 1) * width;
        u = 0;
        v = 0;
        for (i = 0; i < width; i++, yp++) {
            y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0)
                y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }

            y1192 = 1192 * y;
            r = (y1192 + 1634 * v);
            g = (y1192 - 833 * v - 400 * u);
            b = (y1192 + 2066 * u);

            // Java's functions are faster than 'IFs'
            r = Math.max(0, Math.min(r, 262143));
            g = Math.max(0, Math.min(g, 262143));
            b = Math.max(0, Math.min(b, 262143));

            // rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) &
            // 0xff00) | ((b >> 10) & 0xff);
            // rgba, divide 2^10 ( >> 10)
            rgba[yp] = ((r << 14) & 0xff000000) | ((g << 6) & 0xff0000)
                    | ((b >> 2) | 0xff00);
        }
    }
}

Speed is comparable to YuvImage.compressToJpeg() with ByteArrayOutputStream as output (30-50 ms for 640x480 image).

Result: a Samsung S4 mini (2x1.7GHz) can't compress to JPEG / convert YUV to RGB in real time (640x480 @ 30fps).

JerzySkalski
  • Is the last line right? ((b >> 2) | 0xff00) shouldn't that be ((b >> 2) & 0xff00) ? – SePröbläm Jan 18 '16 at 12:56
  • I did some measurements with the above NV21 to RGB conversion algorithm on a Nexus 7 (Android 6.0.1). Converting a 1600x1200 pixel preview frame took between 250ms-300ms. Note: Using if/else instead of Math.min/max turned out to be faster. However, the big surprise came with the use of the BoofCV library. Converting from NV21 into BoofCV's image class took between 100ms-145ms for color conversion and between 230ms-260ms if the preview frame was also converted to grayscale. In time critical situations, where the preview frame is used to analyze the scene, BoofCV might be worth considering. – SePröbläm Jan 19 '16 at 03:56
6

The Java implementation is about 10 times slower than the C version, so I suggest you use the GPUImage library or move this part of the code to native code.

There is an Android version of GPUImage: https://github.com/CyberAgent/android-gpuimage

You can include this library if you use Gradle, and call the method: GPUImageNativeLibrary.YUVtoRBGA(inputArray, WIDTH, HEIGHT, outputArray);

Comparing the time for an NV21 image of 960x540: with the Java code above it costs 200ms+, while the GPUImage version takes just 10ms~20ms.
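
A rough sketch of how that call might be wired up (assuming the android-gpuimage dependency is on the classpath; the exact output channel order is worth double-checking against the library):

// Allocate one packed pixel per entry for the converted frame
int[] outputArray = new int[width * height];
GPUImageNativeLibrary.YUVtoRBGA(data, width, height, outputArray);

// The packed pixels can then be pushed into a Bitmap if needed
Bitmap bitmap = Bitmap.createBitmap(outputArray, width, height, Bitmap.Config.ARGB_8888);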

promenade
2

A fix-up of the above code snippet:

static public void decodeYUV420SP(int[] rgba, byte[] yuv420sp, int width,
                              int height) {
    final int frameSize = width * height;
    int r, g, b, y1192, y, i, uvp, u, v;
    for (int j = 0, yp = 0; j < height; j++) {
        uvp = frameSize + (j >> 1) * width;
        u = 0;
        v = 0;
        for (i = 0; i < width; i++, yp++) {
            y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0)
                y = 0;
            if ((i & 1) == 0) {
            // the above answer is wrong at the following lines: just swap u and v
                u = (0xff & yuv420sp[uvp++]) - 128;
                v = (0xff & yuv420sp[uvp++]) - 128;
            }

            y1192 = 1192 * y;
            r = (y1192 + 1634 * v);
            g = (y1192 - 833 * v - 400 * u);
            b = (y1192 + 2066 * u);

            r = Math.max(0, Math.min(r, 262143));
            g = Math.max(0, Math.min(g, 262143));
            b = Math.max(0, Math.min(b, 262143));

            // combine ARGB
            rgba[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00)
                    | ((b >> 10) | 0xff);
        }
    }
}
HC_ZZ
  • There seems to be something wrong in the last line: ((g >> 2) & 0xff00) | ((b >> 10) | 0xff) Shouldn't that be: ((g >> 2) & 0xff00) | ((b >> 10) & 0xff)? – SePröbläm Jan 18 '16 at 13:00
  • Howto use the int[] to load into bitmap? – e-info128 Jan 02 '17 at 22:33
  • @SePröbläm You're right, I came to the same conclusion after checking this code. Instead of bitwise or `|` there should be bitwise and `&` in `((b >> 10) | 0xff)` – iaforek Nov 22 '19 at 17:21
1

Try RenderScript's ScriptIntrinsicYuvToRGB, which comes with Jelly Bean 4.2 (API 17+).

https://developer.android.com/reference/android/renderscript/ScriptIntrinsicYuvToRGB.html

On a Nexus 7 (2013, Jelly Bean 4.3), a 1920x1080 image conversion (full-HD camera preview) takes about 7 ms.
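
For reference, a Java version of the same approach (mirroring the Kotlin sample in an earlier answer; context, data, width and height are assumed to come from your own preview code):

RenderScript rs = RenderScript.create(context);
ScriptIntrinsicYuvToRGB yuvToRgb = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));

// Input allocation sized to the raw NV21 byte array
Type.Builder yuvType = new Type.Builder(rs, Element.U8(rs)).setX(data.length);
Allocation inAlloc = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT);

// Output allocation matching the preview dimensions
Type.Builder rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(width).setY(height);
Allocation outAlloc = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT);

inAlloc.copyFrom(data);
yuvToRgb.setInput(inAlloc);
yuvToRgb.forEach(outAlloc);

Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
outAlloc.copyTo(bitmap);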

Matti81
  • Unfortunately, the intrinsic conversion is tuned for video processing, not for camera image stream (see [the formulae](https://android.googlesource.com/platform/frameworks/rs/+/refs/heads/master/cpu_ref/rsCpuIntrinsicYuvToRGB.cpp#61)). I have a [fix](https://stackoverflow.com/a/63575255/192373) that resolves this. – Alex Cohn Aug 25 '20 at 09:25
1

You can use the ColorHelper library for this:

using ColorHelper;

YUV yuv = new YUV(0.1, 0.1, 0.2);
RGB rgb = ColorConverter.YuvToRgb(yuv);


progm
0

You can get the bitmap directly from the TextureView, which is really fast.

Bitmap bitmap = textureView.getBitmap();
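
A small guarded sketch (getBitmap only works once the TextureView has a surface, and can return null):

// Only grab the frame once the TextureView is attached and available
if (textureView.isAvailable()) {
    Bitmap frame = textureView.getBitmap();      // preview-sized copy
    // Or request a scaled-down copy directly:
    // Bitmap small = textureView.getBitmap(640, 480);
    if (frame != null) {
        // ... use the bitmap ...
    }
}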
Satheesh
0

After reading many suggested links, articles, etc., I found the following great Android example app which captures the YUV image from the camera and converts it into an RGB Bitmap:

https://github.com/android/camera-samples/tree/main/CameraXTfLite

Nice things about this:

  • It uses the aforementioned RenderScript framework and the code can be easily reused - check out the YuvToRgbConverter.kt class
  • according to their documentation, this code achieves "~30 FPS @ 640x480 on a Pixel 3 phone"

After switching to this code (especially the YUV to RGB conversion part) my framerate doubled! I am not quite reaching 30 FPS overall since I am doing a few more things after capturing the image, but the speed-up is remarkable!

Dragan S.