
I've implemented a simple application which shows the camera picture on the screen. What I'd like to do now is grab a single frame and process it as a bitmap. From what I could find out, this is not an easy thing to do.

I've tried the onPreviewFrame method, which gives you the current frame as a byte array, and tried to decode it with the BitmapFactory class, but it returns null. The format of the frame is headerless YUV, which could be translated to a bitmap, but that takes too long on a phone. I've also read that the onPreviewFrame method has constraints on its runtime; if it takes too long, the application could crash.
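(For context on why decodeByteArray fails: BitmapFactory only understands container formats such as JPEG or PNG, while the default preview format, NV21, is a raw buffer: a full-resolution Y plane followed by an interleaved half-resolution V/U plane. A minimal sketch of the layout arithmetic, in plain Java with no Android dependencies; the frame size below is a made-up example:)

```java
public class Nv21Layout {
    // Byte offset of the luma (Y) sample for pixel (x, y)
    static int lumaOffset(int width, int x, int y) {
        return y * width + x;
    }

    // Byte offset of the interleaved V/U pair shared by the 2x2 block
    // containing pixel (x, y); in NV21 the V byte comes first, then U
    static int chromaOffset(int width, int height, int x, int y) {
        return width * height + (y >> 1) * width + (x >> 1) * 2;
    }

    // Total buffer size: 8 bits/pixel luma + 4 bits/pixel chroma = 12 bpp
    static int bufferSize(int width, int height) {
        return width * height * 3 / 2;
    }

    public static void main(String[] args) {
        int w = 320, h = 240; // hypothetical preview size
        System.out.println(bufferSize(w, h));         // 115200
        System.out.println(chromaOffset(w, h, 2, 1)); // 76802
    }
}
```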

So what is the right way to do this?

Alexander Stolz
  • Using ffmpeg we can do it in native code. – Vinay Dec 14 '09 at 15:15
  • Hi @Vinay, I have been trying this for many days but still have no result. If you worked on it, could you please look at http://stackoverflow.com/questions/11322952/decoding-video-using-ffmpeg-for-android — I got this far; if you want to see the code I will push it to git. – Rahul Upadhyay Jul 07 '12 at 06:20

5 Answers


OK, what we ended up doing is using the onPreviewFrame method and decoding the data in a separate thread, using a method that can be found in the Android developers group:

int[] argb8888 = new int[camSize.width * camSize.height];
decodeYUV(argb8888, data, camSize.width, camSize.height);
Bitmap bitmap = Bitmap.createBitmap(argb8888, camSize.width,
                    camSize.height, Config.ARGB_8888);

...

// decode Y, U, and V values on the YUV 420 buffer described as YCbCr_422_SP by Android 
// David Manpearl 081201 
public void decodeYUV(int[] out, byte[] fg, int width, int height)
        throws NullPointerException, IllegalArgumentException {
    int sz = width * height;
    if (out == null)
        throw new NullPointerException("buffer out is null");
    if (out.length < sz)
        throw new IllegalArgumentException("buffer out size " + out.length
                + " < minimum " + sz);
    if (fg == null)
        throw new NullPointerException("buffer 'fg' is null");
    if (fg.length < sz * 3 / 2)
        throw new IllegalArgumentException("buffer fg size " + fg.length
                + " < minimum " + sz * 3 / 2);
    int i, j;
    int Y, Cr = 0, Cb = 0;
    for (j = 0; j < height; j++) {
        int pixPtr = j * width;
        final int jDiv2 = j >> 1;
        for (i = 0; i < width; i++) {
            // mask to get the unsigned luma value from the signed byte
            Y = fg[pixPtr] & 0xff;
            if ((i & 0x1) == 0) {
                final int cOff = sz + jDiv2 * width + (i >> 1) * 2;
                // mask to unsigned, then center the chroma samples around zero
                Cb = (fg[cOff] & 0xff) - 128;
                Cr = (fg[cOff + 1] & 0xff) - 128;
            }
            int R = Y + Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5);
            if (R < 0)
                R = 0;
            else if (R > 255)
                R = 255;
            int G = Y - (Cb >> 2) + (Cb >> 4) + (Cb >> 5) - (Cr >> 1)
                    + (Cr >> 3) + (Cr >> 4) + (Cr >> 5);
            if (G < 0)
                G = 0;
            else if (G > 255)
                G = 255;
            int B = Y + Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6);
            if (B < 0)
                B = 0;
            else if (B > 255)
                B = 255;
            out[pixPtr++] = 0xff000000 + (B << 16) + (G << 8) + R;
        }
    }

}

Link: http://groups.google.com/group/android-developers/browse_thread/thread/c85e829ab209ceea/3f180a16a4872b58?lnk=gst&q=onpreviewframe#3f180a16a4872b58
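
One subtlety worth calling out in code like this: Java bytes are signed, so the raw YUV samples must be converted to unsigned values before the arithmetic. Some circulating versions of this snippet add 255 to negative bytes, which is off by one; masking with `& 0xff` is exact. A quick standalone check:

```java
public class UnsignedByteDemo {
    public static void main(String[] args) {
        byte sample = (byte) 200;          // a luma value above 127 wraps negative
        System.out.println(sample);        // -56
        System.out.println(sample & 0xff); // 200 (correct unsigned value)
        System.out.println(sample + 255);  // 199 (the "+ 255" trick is off by one)
    }
}
```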

Alexander Stolz
  • Hey Alexander, are you doing any other processing or voodoo magic in the background? I've tried this and end up with an image like this: https://dl.dropbox.com/u/6620976/image.jpg. Any insight is appreciated. – Brian D Feb 28 '11 at 19:26
  • Nevermind! I just needed to define my surface's width and height and it all works great. Thanks for posting this up! – Brian D Feb 28 '11 at 20:02
  • @Brian D: I am facing the same problem, can you please show me your surface code? – Mudassir May 23 '11 at 05:12
  • For a native implementation, one can look at [libyuv](http://code.google.com/p/libyuv/). – auselen Sep 16 '12 at 21:25
  • hi. Is it ARGB format or RGB ? How do we do if we only want RGB with the following code ? Thanks a lot for your answer by the way, really help. – hico Aug 08 '13 at 15:35
  • What is the array "argb8888" that must be passed to the decodeYUV method? – user140888 Apr 09 '15 at 10:38
  • This seems to work for me only sometimes. I've defined the Picture size to be the minimum the camera will take. But I don't know what the `width`, `height` parameters of this conversion method should be. Suppose the picture is `320x240` res. And the data in is an array of bytes of size `8500`. What should the `width`, `height` restrictions be as an argument to the above method? – Greg Peckory Jan 18 '16 at 15:27
  • Also what should the array length be for int[] out, before I pass it through? – Greg Peckory Jan 18 '16 at 15:29
  • int[] size should be width * height – Mereb Hayl Jun 29 '20 at 08:55

In API 17+, you can convert NV21 to RGBA_8888 with the `ScriptIntrinsicYuvToRGB` RenderScript intrinsic. This lets you process preview frames without encoding/decoding them by hand:

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();
    Bitmap bitmap = Bitmap.createBitmap(size.width, size.height, Bitmap.Config.ARGB_8888);
    Allocation bmData = renderScriptNV21ToRGBA888(
        mContext,
        size.width,
        size.height,
        data);
    bmData.copyTo(bitmap);
}

public Allocation renderScriptNV21ToRGBA888(Context context, int width, int height, byte[] nv21) {
  RenderScript rs = RenderScript.create(context);
  ScriptIntrinsicYuvToRGB yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));

  Type.Builder yuvType = new Type.Builder(rs, Element.U8(rs)).setX(nv21.length);
  Allocation in = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT);

  Type.Builder rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(width).setY(height);
  Allocation out = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT);

  in.copyFrom(nv21);

  yuvToRgbIntrinsic.setInput(in);
  yuvToRgbIntrinsic.forEach(out);
  return out;
}
Tim

I actually tried the code given in the previous answer and found that the color values are not exact. I checked by grabbing both the preview frame and the result of camera.takePicture(), which directly returns a JPEG array, and the colors were very different. After a bit more searching I found another example that converts the preview image from YCbCr to RGB:

static public void decodeYUV420SP(int[] rgb, byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;

    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);

            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;

            rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
}

The color values given by this and by takePicture() match exactly, so I thought I should post it here. This is where I got the code from. Hope this helps.
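
(As a sanity check on the fixed-point constants — they are the BT.601 limited-range coefficients scaled by 1024, e.g. 1192 ≈ 1.164 × 1024 — here is a self-contained harness that runs the same per-pixel math on a single mid-gray pixel. Y = 128 with neutral chroma should come out as roughly (128 − 16) × 1.164 ≈ 130 on each channel:)

```java
public class Yuv601Check {
    // Same per-pixel arithmetic as decodeYUV420SP above
    static int yuvToArgb(int yRaw, int uRaw, int vRaw) {
        int y = (0xff & yRaw) - 16;
        if (y < 0) y = 0;
        int u = (0xff & uRaw) - 128;
        int v = (0xff & vRaw) - 128;
        int y1192 = 1192 * y;
        int r = y1192 + 1634 * v;
        int g = y1192 - 833 * v - 400 * u;
        int b = y1192 + 2066 * u;
        if (r < 0) r = 0; else if (r > 262143) r = 262143;
        if (g < 0) g = 0; else if (g > 262143) g = 262143;
        if (b < 0) b = 0; else if (b > 262143) b = 262143;
        return 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
    }

    public static void main(String[] args) {
        // mid-gray input: Y = 128, neutral chroma U = V = 128
        int argb = yuvToArgb(128, 128, 128);
        System.out.printf("%08x%n", argb); // ff828282 -> R = G = B = 130
    }
}
```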

Codevalley
  • Thanks - I ended up using the code in [post #37](https://code.google.com/p/android/issues/detail?id=823#c37) instead (it was posted long after this answer was, but for those of us in the future it works nicely). – Matt Apr 12 '13 at 15:06

Tim's RenderScript solution is great. Two comments, though:

  1. Create the `RenderScript` instance and the `in`/`out` Allocations once and reuse them; creating them on every frame hurts performance.
  2. The RenderScript support library lets you use this back to Android 2.3.
Miao Wang

I don't see any of the answers performing better than the built-in way to convert it. You can get the bitmap like this:

Camera.Parameters params = camera.getParameters();
Camera.Size previewSize = params.getPreviewSize();
YuvImage yuv = new YuvImage(data, ImageFormat.NV21, previewSize.width, previewSize.height, null);
ByteArrayOutputStream stream = new ByteArrayOutputStream();

yuv.compressToJpeg(new Rect(0, 0, previewSize.width, previewSize.height), 100, stream);

byte[] buf = stream.toByteArray();
Bitmap bitmap = BitmapFactory.decodeByteArray(buf, 0, buf.length);
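
(One caveat: compressToJpeg routes every frame through a JPEG encode/decode, which allocates per frame and is lossy even at quality 100, since JPEG subsamples chroma. The loss is easy to demonstrate with desktop Java's `ImageIO`, standing in here for the Android encoder:)

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class JpegLossDemo {
    // Encode a hard red/blue edge (worst case for chroma subsampling)
    // to JPEG and back, and count how many pixels changed
    static int roundTripDiff() throws IOException {
        BufferedImage src = new BufferedImage(16, 16, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 16; y++)
            for (int x = 0; x < 16; x++)
                src.setRGB(x, y, x < 8 ? 0xff0000 : 0x0000ff);

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(src, "jpg", out);
        BufferedImage back = ImageIO.read(new ByteArrayInputStream(out.toByteArray()));

        int changed = 0;
        for (int y = 0; y < 16; y++)
            for (int x = 0; x < 16; x++)
                if ((back.getRGB(x, y) & 0xffffff) != (src.getRGB(x, y) & 0xffffff))
                    changed++;
        return changed;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTripDiff() > 0); // true: the round trip is not exact
    }
}
```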
Mereb Hayl