
We obtain our image from the takePicture() function, using the JPEG callback (the third parameter), since we were not able to obtain the raw image even after setting the callback buffer to maximum size. So the image gets compressed to JPEG format; we, however, need the image in the same format as the preview frames: YCbCr_420_SP (NV21). (This format is expected by a third-party library we use, and we don't have the resources for a reimplementation.)

We tried to set the picture format with setPictureFormat() in the parameters when initializing the camera, which sadly didn't help. I guess this function only applies to the raw callback.

We have access to the OpenCV C library on the JNI side, but don't know how to implement the conversion with IplImage.

So currently we are using the following Java implementation for the conversion, which has really poor performance (about 2 seconds for a 3840x2160 picture):

byte [] getNV21(int inputWidth, int inputHeight, Bitmap scaled) {
    int [] argb = new int[inputWidth * inputHeight];
    scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);
    byte [] yuv = new byte[inputWidth*inputHeight*3/2];
    encodeYUV420SP(yuv, argb, inputWidth, inputHeight);
    scaled.recycle();

    return yuv;
}

void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
    final int frameSize = width * height;

    int yIndex = 0;
    int uvIndex = frameSize;
    int R, G, B, Y, U, V;
    int index = 0;
    for (int j = 0; j < height; j++) {
        for (int i = 0; i < width; i++) {
            R = (argb[index] & 0xff0000) >> 16;
            G = (argb[index] & 0xff00) >> 8;
            B = (argb[index] & 0xff) >> 0;

            // well known RGB to YUV algorithm
            Y = ( (  66 * R + 129 * G +  25 * B + 128) >> 8) +  16;
            U = ( ( -38 * R -  74 * G + 112 * B + 128) >> 8) + 128;
            V = ( ( 112 * R -  94 * G -  18 * B + 128) >> 8) + 128;

            // NV21 has a plane of Y and interleaved planes of VU each sampled by a factor of 2
            //    meaning for every 4 Y pixels there are 1 V and 1 U.  Note the sampling is every other
            //    pixel AND every other scanline.
            yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
            if (j % 2 == 0 && index % 2 == 0) {
                yuv420sp[uvIndex++] = (byte)((V<0) ? 0 : ((V > 255) ? 255 : V));
                yuv420sp[uvIndex++] = (byte)((U<0) ? 0 : ((U > 255) ? 255 : U));
            }

            index ++;
        }
    }
}
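As a sanity check, the routine above can be exercised off-device. The class below is a minimal sketch that inlines encodeYUV420SP so it runs without any Android classes; it clamps with Math.min/Math.max instead of the original ternaries but is otherwise identical. For an all-white frame the conversion should yield Y = 235 (video range) and U = V = 128 (neutral chroma).

```java
// Host-side check of the RGB -> NV21 encoder on a tiny 2x2 white frame.
public class Nv21Check {
    static void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
        final int frameSize = width * height;
        int yIndex = 0, uvIndex = frameSize, index = 0;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int R = (argb[index] & 0xff0000) >> 16;
                int G = (argb[index] & 0xff00) >> 8;
                int B = argb[index] & 0xff;

                // Same BT.601 integer coefficients as in the question.
                int Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
                int U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
                int V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;

                yuv420sp[yIndex++] = (byte) Math.max(0, Math.min(255, Y));
                // NV21: one interleaved V/U pair per 2x2 block of Y samples.
                if (j % 2 == 0 && index % 2 == 0) {
                    yuv420sp[uvIndex++] = (byte) Math.max(0, Math.min(255, V));
                    yuv420sp[uvIndex++] = (byte) Math.max(0, Math.min(255, U));
                }
                index++;
            }
        }
    }

    public static void main(String[] args) {
        int w = 2, h = 2;
        int[] argb = new int[w * h];
        java.util.Arrays.fill(argb, 0xFFFFFFFF); // opaque white
        byte[] yuv = new byte[w * h * 3 / 2];
        encodeYUV420SP(yuv, argb, w, h);
        System.out.println((yuv[0] & 0xff) + " " + (yuv[4] & 0xff) + " " + (yuv[5] & 0xff));
    }
}
```

This prints `235 128 128`, confirming the layout: four Y bytes followed by one V/U pair.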

Does someone know what the conversion would look like with the help of OpenCV C, or can alternatively offer a more efficient Java implementation?

Update: After reimplementing the camera class to use the camera2 API, we receive an Image object in the YUV_420_888 format. We then use the following function for conversion to NV21:

private static byte[] YUV_420_888toNV21(Image image) {
    byte[] nv21;
    ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
    ByteBuffer uBuffer = image.getPlanes()[1].getBuffer();
    ByteBuffer vBuffer = image.getPlanes()[2].getBuffer();

    int ySize = yBuffer.remaining();
    int uSize = uBuffer.remaining();
    int vSize = vBuffer.remaining();

    nv21 = new byte[ySize + uSize + vSize];

    //U and V are swapped
    yBuffer.get(nv21, 0, ySize);
    vBuffer.get(nv21, ySize, vSize);
    uBuffer.get(nv21, ySize + vSize, uSize);

    return nv21;
}

While this function works fine with cameraCaptureSessions.setRepeatingRequest, we get a segmentation fault when calling cameraCaptureSessions.capture. Both request the YUV_420_888 format via an ImageReader.
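A stride-aware copy avoids that class of problem. The helper below is a hypothetical plain-Java sketch, not the code we run on-device: it takes the plane buffers and strides as explicit parameters (on-device these would come from image.getPlanes()[n].getBuffer(), getRowStride() and getPixelStride()), so rows with trailing padding and interleaved chroma planes are handled instead of being copied verbatim.

```java
import java.nio.ByteBuffer;

// Hypothetical stride-aware NV21 packer; buffers and strides are passed in
// explicitly so the copy logic can be exercised with synthetic data.
public class Nv21Pack {
    static byte[] toNV21(int width, int height,
                         ByteBuffer y, int yRowStride,
                         ByteBuffer u, ByteBuffer v,
                         int uvRowStride, int uvPixelStride) {
        byte[] out = new byte[width * height * 3 / 2];
        int pos = 0;
        // Luma: copy row by row, skipping any padding bytes at the row ends.
        for (int row = 0; row < height; row++)
            for (int col = 0; col < width; col++)
                out[pos++] = y.get(row * yRowStride + col);
        // Chroma: NV21 wants V then U, interleaved, at quarter resolution.
        for (int row = 0; row < height / 2; row++)
            for (int col = 0; col < width / 2; col++) {
                int i = row * uvRowStride + col * uvPixelStride;
                out[pos++] = v.get(i);
                out[pos++] = u.get(i);
            }
        return out;
    }
}
```

When rowStride equals the width and pixelStride is 1, this degenerates to the bulk copy above; when pixelStride is 2, the U and V planes typically alias the same interleaved memory, which is one reason the plain buffer concatenation can misbehave.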

Alexander Belokon

1 Answer


There are two ways to improve your result.

  • You can use the libjpeg-turbo library to obtain YUV from a JPEG directly; the function is tjDecompressToYUV.

  • You can use RenderScript to convert bitmaps to YUV.

Which one will be better for you depends on the device. Some devices have a hardware-accelerated JPEG decoder for Java; others use libjpeg in software. In the latter case, tjDecompressToYUV will deliver a significant improvement.

If your device runs Android 5 or higher, consider switching to the camera2 API; ImageReader may be able to deliver YUV or RAW images of the desired resolution and quality.

Alex Cohn
  • So we reimplemented our class to use the camera2 API; now we are using an ImageReader to obtain the picture. In the callback we get an Image object, but we need a byte array. Is there an efficient conversion between the two, or are we back at the beginning of our problem? – Alexander Belokon Oct 09 '18 at 11:33
    **camera2** delivers the image in YUV420_888 format, which is explained with beautiful pictures [here](https://stackoverflow.com/q/36212904/192373). If you are lucky, there will be 0 padding. If you are very lucky, the UV plane will follow the Y plane with no alignment correction. In this case, you can treat the whole object as holding an NV21 byte array for you; if you are less lucky, you need to allocate a new array and copy pixels, skipping the garbage. Even this is much, much faster than JPEG manipulations and color conversions. – Alex Cohn Oct 09 '18 at 11:46
  • I updated the question to show my approach; we still have issues capturing a picture, while processing frames works fine – Alexander Belokon Oct 09 '18 at 16:09
  • IMHO, this segfault justifies a separate question. But in a nutshell, your new code ignores `planes[xxx].getRowStride()` and `planes[xxx].getPixelStride()` (see the linked source https://stackoverflow.com/q/36212904/192373 again). For example, the U and V planes often overlap, just as in NV21. – Alex Cohn Oct 09 '18 at 16:26
  • So does this mean I would have to iterate with a given stride through the U/V buffer? How come the result of both function calls is different while the requested type is the same? I opened a new question for further investigation: https://stackoverflow.com/q/52726002/4351182 - thank you for your support – Alexander Belokon Oct 09 '18 at 17:03