
I'm getting preview frames using OnImageAvailableListener:

@Override
public void onImageAvailable(ImageReader reader) {
    Image image = null;
    try {
        image = reader.acquireLatestImage();
        Image.Plane[] planes = image.getPlanes();
        ByteBuffer buffer = planes[0].getBuffer();
        byte[] data = new byte[buffer.capacity()];
        buffer.get(data);
        //data.length=332803; width=3264; height=2448
        Log.e(TAG, "data.length=" + data.length + "; width=" + image.getWidth() + "; height=" + image.getHeight());
        //TODO data processing
    } catch (Exception e) {
        e.printStackTrace();
    }
    if (image != null) {
        image.close();
    }
}

Each time the length of data is different, but the image width and height are the same.
Main problem: data.length is far too small for a resolution like 3264x2448.
The data array should hold 3264*2448 = 7,990,272 bytes, not 300,000 - 600,000.
What is wrong?


The ImageReader is created like this:

imageReader = ImageReader.newInstance(3264, 2448, ImageFormat.JPEG, 5);
Volodymyr Kulyk

3 Answers


I solved this problem by using the YUV_420_888 image format and converting it to JPEG manually.

imageReader = ImageReader.newInstance(MAX_PREVIEW_WIDTH, MAX_PREVIEW_HEIGHT,
                                      ImageFormat.YUV_420_888, 5);
imageReader.setOnImageAvailableListener(this, null);

Surface imageSurface = imageReader.getSurface();
List<Surface> surfaceList = new ArrayList<>();
//...add other surfaces
previewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewRequestBuilder.addTarget(imageSurface);
surfaceList.add(imageSurface);
cameraDevice.createCaptureSession(surfaceList,
        new CameraCaptureSession.StateCallback() {
            //...implement onConfigured, onConfigureFailed for StateCallback
        }, null);

@Override
public void onImageAvailable(ImageReader reader) {
    Image image = reader.acquireLatestImage();
    if (image != null) {
        //converting to JPEG
        byte[] jpegData = ImageUtil.imageToByteArray(image);
        //write to file (for example ..some_path/frame.jpg)
        FileManager.writeFrame(FILE_NAME, jpegData);
        image.close();
    }
}

public final class ImageUtil {

    public static byte[] imageToByteArray(Image image) {
        byte[] data = null;
        if (image.getFormat() == ImageFormat.JPEG) {
            Image.Plane[] planes = image.getPlanes();
            ByteBuffer buffer = planes[0].getBuffer();
            data = new byte[buffer.remaining()]; // remaining(), not capacity(): the JPEG may not fill the whole buffer
            buffer.get(data);
            return data;
        } else if (image.getFormat() == ImageFormat.YUV_420_888) {
            data = NV21toJPEG(
                    YUV_420_888toNV21(image),
                    image.getWidth(), image.getHeight());
        }
        return data;
    }

    private static byte[] YUV_420_888toNV21(Image image) {
        byte[] nv21;
        ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
        ByteBuffer vuBuffer = image.getPlanes()[2].getBuffer();

        int ySize = yBuffer.remaining();
        int vuSize = vuBuffer.remaining();

        nv21 = new byte[ySize + vuSize];

        yBuffer.get(nv21, 0, ySize);
        vuBuffer.get(nv21, ySize, vuSize);

        return nv21;
    }

    private static byte[] NV21toJPEG(byte[] nv21, int width, int height) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
        yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);
        return out.toByteArray();
    }
}

public final class FileManager {
    public static void writeFrame(String fileName, byte[] data) {
        try (BufferedOutputStream bos = new BufferedOutputStream(new FileOutputStream(fileName))) {
            bos.write(data);
            bos.flush();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
akindofyoga
Volodymyr Kulyk
  • I've used the `ImageUtil` class from the answer to convert `YUV_420_888` to `JPEG`, however I am not getting a correct output. By using this class, will it be possible to save the output `byte[]` into a JPEG file, or maybe even display it on a `SurfaceView`? I am trying to do so, however I'm getting distorted images. – ahasbini May 17 '17 at 09:45
  • `data = NV21toJPEG(YUV_420_888toNV21(image), image.getWidth(), image.getHeight());` gives distortions? – Volodymyr Kulyk May 17 '17 at 09:55
  • Yes, I've placed the class as is and implemented the file writing similar to your implementation but using `FileOutputStream` instead, and the final image is distorted. I could show you the complete implementation via a git repo; I'll push in a bit. – ahasbini May 17 '17 at 09:58
  • @ahasbini you should create a standalone question with your code; it's hard to solve your problem in the comment section without seeing what you did. – Volodymyr Kulyk May 17 '17 at 10:00
  • This **YUV_420_888toNV21()** is not reliable, see https://stackoverflow.com/a/52740776/192373 – Alex Cohn Dec 30 '18 at 10:13
  • @AlexCohn thank you for the comment. Are you having some specific trouble with this method? – Volodymyr Kulyk Jan 02 '19 at 11:23
  • Yes, it only works when the input **Image** has U and V pixel strides of 1, or if it is already in NV21 format. Also, it will fail if the image lines have padding. – Alex Cohn Jan 02 '19 at 19:08
  • @AlexCohn thanks, I agree with you. I'll test your solution. – Volodymyr Kulyk Jan 03 '19 at 09:46
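
Following up on the stride caveat in the comments above, a stride-aware repacking can be sketched in plain java.nio (no Android dependencies; the class and method names here are hypothetical, and on a device the buffers and strides would come from `Image.getPlanes()[i].getBuffer()` / `getRowStride()` / `getPixelStride()`):

```java
import java.nio.ByteBuffer;

public class Nv21Packer {
    /**
     * Repacks 8-bit Y/U/V planes into NV21, honoring row and pixel strides.
     * Assumes even width/height and 2x2 chroma subsampling (YUV_420_888).
     */
    public static byte[] toNv21(int width, int height,
                                ByteBuffer y, int yRowStride,
                                ByteBuffer u, ByteBuffer v,
                                int uvRowStride, int uvPixelStride) {
        byte[] nv21 = new byte[width * height * 3 / 2];
        int pos = 0;
        // Copy luma row by row so any row padding is skipped.
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                nv21[pos++] = y.get(row * yRowStride + col);
            }
        }
        // NV21 expects interleaved V,U after the luma plane.
        for (int row = 0; row < height / 2; row++) {
            for (int col = 0; col < width / 2; col++) {
                int idx = row * uvRowStride + col * uvPixelStride;
                nv21[pos++] = v.get(idx);
                nv21[pos++] = u.get(idx);
            }
        }
        return nv21;
    }
}
```

Unlike the bulk-copy version in the answer, this works for any pixel stride and for padded rows, at the cost of a per-pixel loop.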

I am not sure, but I think you are taking only one of the planes of the YUV_420_888 format (the luminance part).

In my case, I usually transform the image to a byte[] this way:

Image m_img = reader.acquireLatestImage(); // the Image from the ImageReader
Log.v(LOG_TAG, "Format -> " + m_img.getFormat());
Image.Plane Y = m_img.getPlanes()[0];
Image.Plane U = m_img.getPlanes()[1];
Image.Plane V = m_img.getPlanes()[2];

int Yb = Y.getBuffer().remaining();
int Ub = U.getBuffer().remaining();
int Vb = V.getBuffer().remaining();

//your data length should be this byte array's length
byte[] data = new byte[Yb + Ub + Vb];

Y.getBuffer().get(data, 0, Yb);
U.getBuffer().get(data, Yb, Ub);
V.getBuffer().get(data, Yb + Ub, Vb);

final int width = m_img.getWidth();
final int height = m_img.getHeight();

And I use this byte buffer to transform to RGB.
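
The YUV-to-RGB step mentioned above can be sketched per pixel with the usual full-range BT.601 formulas (plain Java; the helper class is hypothetical, not part of the answer):

```java
public class YuvToRgb {
    /** Converts one full-range BT.601 YUV pixel (each component 0-255) to packed ARGB. */
    public static int yuvToArgb(int y, int u, int v) {
        int d = u - 128;
        int e = v - 128;
        int r = clamp((int) (y + 1.402f * e));
        int g = clamp((int) (y - 0.344f * d - 0.714f * e));
        int b = clamp((int) (y + 1.772f * d));
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    private static int clamp(int x) {
        return Math.max(0, Math.min(255, x));
    }
}
```

Looping this over the Y/U/V samples gives an int[] that could then be handed to Bitmap.createBitmap(..., Bitmap.Config.ARGB_8888).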

Hope this helps.

Cheers. Unai.

uelordi

Your code is requesting JPEG-format images, which are compressed. They'll change in size for every frame, and they'll be much smaller than the uncompressed image. If you want to do nothing besides save JPEG images, you can just save what you have in the byte[] data to disk and you're done.

If you want to actually do something with the JPEG, you can use BitmapFactory.decodeByteArray() to convert it to a Bitmap, for example, though that's pretty inefficient.

Or you can switch to YUV, which is more efficient, but you need to do more work to get a Bitmap out of it.
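
To put rough numbers on the compression ratio this answer describes, the uncompressed sizes for the asker's 3264x2448 frames work out as follows (plain-Java arithmetic, hypothetical helper class, no Android dependencies):

```java
public class FrameSizes {
    // Uncompressed sizes in bytes for a W x H frame.
    public static long lumaBytes(int w, int h)     { return (long) w * h; }          // Y plane only
    public static long yuv420Bytes(int w, int h)   { return (long) w * h * 3 / 2; }  // full 4:2:0 frame (NV21)
    public static long argb8888Bytes(int w, int h) { return (long) w * h * 4; }      // decoded ARGB Bitmap

    public static void main(String[] args) {
        // 3264x2448: luma = 7,990,272 B; YUV 4:2:0 = 11,985,408 B; ARGB = 31,961,088 B.
        // The 300-600 KB the asker observed is the compressed JPEG, which varies per frame.
        System.out.println(lumaBytes(3264, 2448) + " "
                + yuv420Bytes(3264, 2448) + " "
                + argb8888Bytes(3264, 2448));
    }
}
```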

Eddy Talvala
  • Converting to a Bitmap and back each frame just for decoding is not an option. – Volodymyr Kulyk Oct 22 '16 at 09:53
  • Your current solution seems to be to reshuffle the YUV data to NV21 and then compress it to a JPEG; it's not clear what you want the YUV for, but it's certainly cheaper to just ask for JPEG if all you want to do is save the data. You can ask for both JPEG and YUV, depending on resolution, which might be the most efficient option, though JPEG output may have a lower frame rate than 30fps. – Eddy Talvala Oct 24 '16 at 00:17
  • I'm using NV21 for QR recognition/decoding, and JPEG for sending frames to the server. – Volodymyr Kulyk Oct 24 '16 at 07:31