
I am working with camera2Basic and trying to get each frame's data so I can do some image processing. I am using the camera2 API on Android 5.0. Everything is fine when I only run the camera preview, and it is fluid. But the preview stutters as soon as I use the ImageReader.OnImageAvailableListener callback to get each frame's data, which makes for a bad user experience. The following is my related code:

This is the setup for the camera and the ImageReader; I set the image format to YUV_420_888:

public <T> Size setUpCameraOutputs(CameraManager cameraManager, Class<T> kClass, int width, int height) {
    boolean flagSuccess = true;
    try {
        for (String cameraId : cameraManager.getCameraIdList()) {
            CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
            // choose the front or back camera
            if (FLAG_CAMERA.BACK_CAMERA == mChosenCamera &&        
                    CameraCharacteristics.LENS_FACING_BACK != characteristics.get(CameraCharacteristics.LENS_FACING)) {
                continue;
            }
            if (FLAG_CAMERA.FRONT_CAMERA == mChosenCamera &&  
                    CameraCharacteristics.LENS_FACING_FRONT != characteristics.get(CameraCharacteristics.LENS_FACING)) {
                continue;
            }
            StreamConfigurationMap map = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);

            Size largestSize = Collections.max(
                    Arrays.asList(map.getOutputSizes(ImageFormat.YUV_420_888)),
                    new CompareSizesByArea());

            mImageReader = ImageReader.newInstance(largestSize.getWidth(), largestSize.getHeight(),
                    ImageFormat.YUV_420_888, 3);

            mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);
            ...
            mCameraId = cameraId;
       }
    } catch (CameraAccessException e) {
        e.printStackTrace();
    } catch (NullPointerException e) {
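        // NPE is thrown when the Camera2 API is used but not supported on the
        // device (the camera2Basic sample catches it for the same reason);
        // just ignore it here.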

    }
    ......
}
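
For completeness, mBackgroundHandler is created roughly the same way as in the camera2Basic sample: a HandlerThread is started before the camera is opened, and its Looper backs the Handler, so all camera callbacks run off the UI thread.

private HandlerThread mBackgroundThread;

private void startBackgroundThread() {
    mBackgroundThread = new HandlerThread("CameraBackground");
    mBackgroundThread.start();
    // Callbacks posted to this handler run on the background thread.
    mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
}

private void stopBackgroundThread() {
    mBackgroundThread.quitSafely();
    try {
        mBackgroundThread.join();
        mBackgroundThread = null;
        mBackgroundHandler = null;
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}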

When the camera has opened successfully, I create a CameraCaptureSession for the camera preview:

private void createCameraPreviewSession() {
    if (null == mTexture) {
        return;
    }

    // We configure the size of default buffer to be the size of camera preview we want.
    mTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());

    // This is the output Surface we need to start preview
    Surface surface = new Surface(mTexture);

    // We set up a CaptureRequest.Builder with the output Surface.
    try {
        mPreviewRequestBuilder =
                mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
        mPreviewRequestBuilder.addTarget(surface);

        // We create a CameraCaptureSession for camera preview
        mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
                new CameraCaptureSession.StateCallback() {

                    @Override
                    public void onConfigured(CameraCaptureSession session) {
                        if (null == mCameraDevice) {
                            return;
                        }

                        // when the session is ready, we start displaying the preview
                        mCaptureSession = session;

                        // Finally, we start displaying the camera preview
                        mPreviewRequest = mPreviewRequestBuilder.build();
                        try {
                            mCaptureSession.setRepeatingRequest(mPreviewRequest,
                                    mCaptureCallback, mBackgroundHandler);
                        } catch (CameraAccessException e) {
                            e.printStackTrace();
                        }
                    }

                    @Override
                    public void onConfigureFailed(CameraCaptureSession session) {

                    }
                }, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}

Finally, here is the ImageReader.OnImageAvailableListener callback:

private final ImageReader.OnImageAvailableListener mOnImageAvailableListener =
        new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Log.d(TAG, "The onImageAvailable thread id: " + Thread.currentThread().getId());
        // acquireLatestImage() can return null if no new frame is ready yet
        Image readImage = reader.acquireLatestImage();
        if (readImage != null) {
            readImage.close();
        }
    }
};

Maybe my setup is wrong, but I have tried several times and it doesn't work. Maybe there is another way to get frame data rather than ImageReader, but I don't know it. Does anybody know how to get each frame's data in real time?

CJZ
  • What sort of processing are you trying to do? I think if you want to save the image, use an ImageReader but if you want to do efficient real-time processing, you should send the data to the Surface buffer associated with an Allocation instead. – rcsumner Aug 18 '15 at 04:46
  • @Sumner, I want to get each frame from the camera and do face detection, rather than saving the image. The solution you suggest seems good; can you give more detail? – CJZ Aug 18 '15 at 04:56
  • ...haha not really, sorry. I usually just save the images for my purposes. I know it involves using RenderScript, though. Look at the Allocation class documentation. – rcsumner Aug 18 '15 at 05:00
  • However, also note that many devices offer face detection, though not all. If you have a specific device you are targeting, see if it has it built in via the camera2 API. – rcsumner Aug 18 '15 at 05:03
  • @Sumner, I have the same problem, but I am trying to check for a QR code in real time; should I use MediaCodec as the Surface buffer? – Gutyn Oct 26 '15 at 20:43
  • @CJZ, have you found a solution to your question? I have the same problem. – Manuel Schmitzberger Jun 27 '16 at 07:05
  • What is this line for? mPreviewRequestBuilder.addTarget(mImageReader.getSurface()); – Wasim Ahmed Mar 20 '17 at 07:20
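
Regarding the Allocation approach rcsumner mentions above: a rough, untested outline based on the android.renderscript API (my own sketch, not code from the question) would be to give the camera an Allocation-backed Surface instead of an ImageReader:

// Rough outline only: uses android.renderscript (RenderScript, Type, Element, Allocation).
RenderScript rs = RenderScript.create(context);   // 'context' = your Activity/Context

// A YUV Allocation whose Surface can receive camera frames directly.
Type yuvType = new Type.Builder(rs, Element.YUV(rs))
        .setX(mPreviewSize.getWidth())
        .setY(mPreviewSize.getHeight())
        .setYuvFormat(ImageFormat.YUV_420_888)
        .create();
final Allocation input = Allocation.createTyped(rs, yuvType,
        Allocation.USAGE_IO_INPUT | Allocation.USAGE_SCRIPT);

// Latch each new frame as it arrives, then run a RenderScript kernel on it.
input.setOnBufferAvailableListener(new Allocation.OnBufferAvailableListener() {
    @Override
    public void onBufferAvailable(Allocation a) {
        a.ioReceive();   // make the newest buffer the current one
        // process 'a' here, e.g. with a ScriptIntrinsic or your own kernel
    }
});

// Then use input.getSurface() as the request/session target instead of
// mImageReader.getSurface().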

2 Answers


I do not believe that Chen is correct. The image format has almost 0 effect on the speed on the devices I have tested. Instead, the problem seems to be with the image size. On an Xperia Z3 Compact with the image format YUV_420_888, I am offered a bunch of different options in the StreamConfigurationMap's getOutputSizes method:

[1600x1200, 1280x720, 960x720, 720x480, 640x480, 480x320, 320x240, 176x144]

For these respective sizes, the maximum fps values I get when setting mImageReader.getSurface() as a target for mPreviewRequestBuilder are:

[13, 18, 25, 28, 30, 30, 30, 30]

So one solution is to use a lower resolution to achieve the rate you want. For the curious... note that these timings do not seem to be affected by the line

    mPreviewRequestBuilder.addTarget(surface);
...
    mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),

I was worried that adding the surface on the screen might be adding overhead, but if I remove that first line and change the second to

    mCameraDevice.createCaptureSession(Arrays.asList(mImageReader.getSurface()),

then I see the timings change by less than 1 fps. So it doesn't seem to matter whether you are also displaying the image on the screen.

I think there is simply some overhead in the camera2 API or ImageReader's framework that makes it impossible to get the full rate that the TextureView is clearly getting.
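
To apply the lower-resolution workaround from the start of this answer, one option is to pick the largest YUV size at or below a cap instead of the overall largest. This is a minimal sketch: the 1280x720 cap is arbitrary, map is the StreamConfigurationMap from the question, and CompareSizesByArea is the question's comparator.

// Choose the largest YUV_420_888 size whose area does not exceed 1280x720.
Size chosen = null;
for (Size s : map.getOutputSizes(ImageFormat.YUV_420_888)) {
    long area = (long) s.getWidth() * s.getHeight();
    if (area <= 1280L * 720
            && (chosen == null
                || area > (long) chosen.getWidth() * chosen.getHeight())) {
        chosen = s;
    }
}
if (chosen == null) {
    // Nothing under the cap; fall back to the smallest size on offer.
    chosen = Collections.min(
            Arrays.asList(map.getOutputSizes(ImageFormat.YUV_420_888)),
            new CompareSizesByArea());
}
mImageReader = ImageReader.newInstance(chosen.getWidth(), chosen.getHeight(),
        ImageFormat.YUV_420_888, 3);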

One of the most disappointing things of all is that, if you switch back to the deprecated Camera API, you can easily get 30 fps by setting up a PreviewCallback via the Camera.setPreviewCallbackWithBuffer method. With that method, I am able to get 30fps regardless of the resolution. Specifically, although it does not offer me 1600x1200 directly, it does offer 1920x1080, and even that is 30fps.
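
For reference, that old-API path looks roughly like this. It is only a minimal sketch: it leaves out the SurfaceHolder/SurfaceTexture plumbing a real preview needs, and 1920x1080 is just the size mentioned above, so check getSupportedPreviewSizes() on a real device.

// Deprecated android.hardware.Camera API, shown only for comparison.
Camera camera = Camera.open();
Camera.Parameters params = camera.getParameters();
params.setPreviewSize(1920, 1080);
camera.setParameters(params);

// Pre-allocate one NV21-sized buffer and recycle it on every frame.
int bufferSize = 1920 * 1080 * ImageFormat.getBitsPerPixel(ImageFormat.NV21) / 8;
camera.addCallbackBuffer(new byte[bufferSize]);
camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        // 'data' is the NV21 frame; process it, then hand the buffer back.
        cam.addCallbackBuffer(data);
    }
});
// A preview surface must have been set (setPreviewTexture / setPreviewDisplay)
// before this call for frames to start flowing.
camera.startPreview();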

bremen_matt
  • This might be affected by the amount of processing in the ImageReader's callback and the buffer size you choose (in ImageReader.newInstance(width, height, format, ). If you set a low buffer size (e.g. only 1 frame), the camera must wait until your processing is completed, so you might get a lower FPS. If you use a larger buffer (e.g. 3 or 5 frames), you will have more time for processing a single frame without affecting the FPS of the preview. In onImageAvailable() you should also use image.acquireLatestImage() instead of image.acquireNextImage() to not slow down the FPS. – Robyer May 28 '19 at 07:31
  • Are you speaking theoretically or from experience? Perhaps what you talk about is true for newer devices. When I do timing tests, I make sure to do them with an essentially empty callback. My observations are noted above. However, this answer is 2 years old, and it could be that my answer is not relevant to newer devices. – bremen_matt May 28 '19 at 08:11
  • I am speaking from experience, but with newer devices with proper Camera2 support (not LEGACY level) and now I see I was also talking about different situation than you (I was talking about lower FPS in preview surface during processing frames via ImageReader surface). I don't know what your device's Camera2 support was, but I can imagine that if it was LEGACY (which might be basically just wrapper around Camera1 API) then it could bring some performance issues. On current devices I'm getting maximum FPS on any resolution with ImageReader and YUV_420_888. – Robyer May 29 '19 at 10:37
  • I tested 5+ legacy devices, and I definitely was NOT getting max fps. At the time, it was really annoying, because the Android devs kept saying that that was impossible since the Camera2 api is a "thin wrapper" (their words not mine). But I did eventually get my hands on some non-legacy devices to confirm that the issue was indeed isolated to legacy devices. Here is a question at the time: https://stackoverflow.com/questions/41945407/low-fps-with-camera2-api – bremen_matt May 29 '19 at 10:46
  • Interesting. I just realized, when you read this https://developer.android.com/reference/android/graphics/ImageFormat#NV21 they say `This [NV21] is the default format for Camera preview images (...) For the android.hardware.camera2 API, the YUV_420_888 format is recommended for YUV output instead.` So what if they are doing NV21->YUV_420_888 conversion under the hood on legacy devices in your case? That would explain lower framerate and solution might be just to request NV21 instead. – Robyer May 30 '19 at 08:27
  • This is on the fringes of my memory, but I do remember testing with every single format supported by the devices. I don't do Android programming anymore, but I think the Camera2 API lets you query for supported formats, and I tried looping over all of those. No difference. If you really want to get to the bottom of this, though, you are going to have to find a working legacy device. Like I said, I haven't done Android in more than a year. – bremen_matt May 30 '19 at 12:39
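
To make Robyer's advice above concrete, here is a minimal sketch of a listener that drops stale frames, copies out only what it needs, and closes the Image immediately. processLuminance is a hypothetical placeholder for the caller's own code, not an Android API.

private final ImageReader.OnImageAvailableListener mOnImageAvailableListener =
        new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        // acquireLatestImage() skips stale frames instead of queueing them.
        Image image = reader.acquireLatestImage();
        if (image == null) {
            return;
        }
        try {
            // Copy the Y (luminance) plane, which is enough for many detection
            // tasks. Note the row stride can be larger than the width on some
            // devices: see image.getPlanes()[0].getRowStride().
            ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
            byte[] yData = new byte[yBuffer.remaining()];
            yBuffer.get(yData);
            // Hypothetical hook for your own processing, ideally dispatched to
            // another thread so this callback returns quickly.
            processLuminance(yData, image.getWidth(), image.getHeight());
        } finally {
            // Closing promptly returns the buffer to the ImageReader so the
            // camera is never starved of output buffers.
            image.close();
        }
    }
};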

I'm trying the same thing; I think you could change the format, like this:

mImageReader = ImageReader.newInstance(largestSize.getWidth(),
                                       largestSize.getHeight(),
                                       ImageFormat.FLEX_RGB_888, 3);

Because using YUV may cause the CPU to compress the data, and that may cost some time. RGB can be displayed on the device directly. Also, detecting faces in the image should be done in another thread, as you surely know.
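
Note that whether FLEX_RGB_888 is even offered depends on the device; as far as I know, many cameras only stream YUV_420_888, JPEG and RAW, and the FLEX_RGB_888 constant itself appeared in a later API level than the Android 5.0 the question targets, if I remember correctly. A small sketch of checking what the StreamConfigurationMap actually advertises before picking a format:

StreamConfigurationMap map = characteristics.get(
        CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);

boolean rgbSupported = false;
for (int format : map.getOutputFormats()) {
    Log.d(TAG, "Supported output format: 0x" + Integer.toHexString(format));
    if (format == ImageFormat.FLEX_RGB_888) {
        rgbSupported = true;
    }
}
// Only create the ImageReader with FLEX_RGB_888 if it is actually listed;
// otherwise stay with YUV_420_888.
int readerFormat = rgbSupported ? ImageFormat.FLEX_RGB_888 : ImageFormat.YUV_420_888;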

Vasfed