
I need to do some real-time image processing on the camera preview data, such as face detection (via a C++ library), and then display the processed preview with the faces labeled on screen.

I have read http://nezarobot.blogspot.com/2016/03/android-surfacetexture-camera2-opencv.html and Eddy Talvala's answer to Android camera2 API - Display processed frame in real time. Following those two pages, I managed to build the app (it does not call the face detection lib yet; it only tries to display the preview using ANativeWindow), but every time I run it on Google Pixel - 7.1.0 - API 25 running on Genymotion, the app crashes with the following log:

08-28 14:23:09.598 2099-2127/tau.camera2demo A/libc: Fatal signal 11 (SIGSEGV), code 2, fault addr 0xd3a96000 in tid 2127 (CAMERA2)
                  [ 08-28 14:23:09.599   117:  117 W/         ]
                  debuggerd: handling request: pid=2099 uid=10067 gid=10067 tid=2127

I googled this but found no answer.

The whole project is on GitHub: https://github.com/Fung-yuantao/android-camera2demo

Here is the key code (I think).

Code in Camera2Demo.java:

private void startPreview(CameraDevice camera) throws CameraAccessException {
    SurfaceTexture texture = mPreviewView.getSurfaceTexture();

    // to set PREVIEW size
    texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
    surface = new Surface(texture);
    try {
        // to set request for PREVIEW
        mPreviewBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }

    mImageReader = ImageReader.newInstance(mImageWidth, mImageHeight, ImageFormat.YUV_420_888, 2);

    mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mHandler);

    mPreviewBuilder.addTarget(mImageReader.getSurface());

    //output Surface
    List<Surface> outputSurfaces = new ArrayList<>();
    outputSurfaces.add(mImageReader.getSurface());

    /*camera.createCaptureSession(
            Arrays.asList(surface, mImageReader.getSurface()),
            mSessionStateCallback, mHandler);
            */
    camera.createCaptureSession(outputSurfaces, mSessionStateCallback, mHandler);
}


private CameraCaptureSession.StateCallback mSessionStateCallback = new CameraCaptureSession.StateCallback() {

    @Override
    public void onConfigured(CameraCaptureSession session) {
        try {
            updatePreview(session);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onConfigureFailed(CameraCaptureSession session) {

    }
};

private void updatePreview(CameraCaptureSession session)
        throws CameraAccessException {
    mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);

    session.setRepeatingRequest(mPreviewBuilder.build(), null, mHandler);
}


private ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {

    @Override
    public void onImageAvailable(ImageReader reader) {
        // get the newest frame
        Image image = reader.acquireNextImage();

        if (image == null) {
            return;
        }

        // print image format
        int format = reader.getImageFormat();
        Log.d(TAG, "the format of captured frame: " + format);

        // HERE to call jni methods
        JNIUtils.display(image.getWidth(), image.getHeight(), image.getPlanes()[0].getBuffer(), surface);


        //ByteBuffer buffer = image.getPlanes()[0].getBuffer();
        //byte[] bytes = new byte[buffer.remaining()];


        image.close();
    }
};

Code in JNIUtils.java:

import android.media.Image;
import android.view.Surface;

import java.nio.ByteBuffer;


public class JNIUtils {
    // TAG for JNIUtils class
    private static final String TAG = "JNIUtils";

    // Load native library.
    static {
        System.loadLibrary("native-lib");
    }

    public static native void display(int srcWidth, int srcHeight, ByteBuffer srcBuffer, Surface surface);
}

Code in native-lib.cpp:

#include <jni.h>
#include <cstring>  // for memcpy
#include <android/log.h>
//#include <android/bitmap.h>
#include <android/native_window_jni.h>

#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR, "Camera2Demo", __VA_ARGS__)

extern "C" {
JNIEXPORT void JNICALL Java_tau_camera2demo_JNIUtils_display(
        JNIEnv *env,
        jclass clazz, // the Java method is static, so JNI passes the class, not an instance
        jint srcWidth,
        jint srcHeight,
        jobject srcBuffer,
        jobject surface) {
    /*
    uint8_t *srcLumaPtr = reinterpret_cast<uint8_t *>(env->GetDirectBufferAddress(srcBuffer));

    if (srcLumaPtr == nullptr) {
        LOGE("srcLumaPtr null ERROR!");
        return;
    }
    */

    ANativeWindow *window = ANativeWindow_fromSurface(env, surface);
    // ANativeWindow_fromSurface already acquires a reference, so no extra
    // ANativeWindow_acquire is needed (it would leak one reference per frame).

    ANativeWindow_Buffer buffer;

    ANativeWindow_setBuffersGeometry(window, srcWidth, srcHeight, 0/* format unchanged */);

    if (int32_t err = ANativeWindow_lock(window, &buffer, NULL)) {
        LOGE("ANativeWindow_lock failed with error code: %d\n", err);
        ANativeWindow_release(window);
        return;
    }

    memcpy(buffer.bits, srcBuffer,  srcWidth * srcHeight * 4);


    ANativeWindow_unlockAndPost(window);
    ANativeWindow_release(window);

    return;
}
}

After I commented the memcpy out, the app no longer crashes but displays nothing. So I guess the problem now comes down to how to correctly use memcpy to copy the captured/processed buffer into buffer.bits.

Update:

I change

memcpy(buffer.bits, srcBuffer, srcWidth * srcHeight * 4);

to

memcpy(buffer.bits, srcLumaPtr, srcWidth * srcHeight * 4);

the app no longer crashes and starts to display, but it displays something strange.

fytao
  • Does your app have CAMERA permission? – Marcos Vasconcelos Aug 28 '17 at 20:28
  • @MarcosVasconcelos sure – fytao Aug 29 '17 at 02:46
  • I would suspect the memcpy - try to comment it out, and if it no longer crashes, make sure you do not copy more than you actually can (I would make sure the format is the correct one) – yakobom Aug 29 '17 at 14:26
  • @yakobom sorry, the memcpy is the problem. After I comment it out, the app no longer crashes but displays nothing. I still have no idea how to correctly use memcpy to copy the captured/processed buffer into the `ANativeWindow_Buffer`. I just started to learn Android, and there are few documents about ANativeWindow on the Internet. – fytao Aug 29 '17 at 17:17
  • I'm not sure if ANativeWindow is your problem, it is very straightforward (ANativeWindow_setBuffersGeometry might be the only 'tricky' part). If memcpy crashes, you either try to copy more bytes than available in your buffer, thus causing a memory overrun, or there's something wrong with the surface and you are not allowed to copy. This is what I can think of. – yakobom Aug 30 '17 at 05:02
  • @yakobom It turns out that I am not copying extra bytes, according to this answer: https://stackoverflow.com/questions/27588389/surfacetexture-surface-mapping-with-anativewindow. After I replace the parameter srcBuffer with a uint8_t pointer `srcLumaPtr`, which is cast from srcBuffer as in the commented-out line in native-lib.cpp, the memcpy works fine and the app starts to display, but it displays a weird image. – fytao Aug 30 '17 at 11:27

2 Answers


Just to put it as an answer, to avoid a long chain of comments: such a crash is typically caused by memcpy copying an improper number of bytes (update, following the other comments: in this case it was due to copying directly from the jobject handle rather than from the buffer's data pointer, which is not allowed).

If you are now getting a weird image, it is probably another issue - I would suspect the image format; try modifying that.
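
For reference, a minimal sketch of how the format mismatch could be confirmed at runtime before copying (ANativeWindow_getFormat is a standard NDK call; the logFormats helper and its call site are made up for this example):

#include <android/log.h>
#include <android/native_window.h>

// Log the window's pixel format alongside the camera image format before copying.
// WINDOW_FORMAT_RGBA_8888 is 1; ImageFormat.YUV_420_888 on the Java side is 35,
// so seeing "1" next to "35" here confirms the copy would mix incompatible layouts.
void logFormats(ANativeWindow *window, int imageFormat) {
    int32_t windowFormat = ANativeWindow_getFormat(window);
    __android_log_print(ANDROID_LOG_DEBUG, "Camera2Demo",
                        "window format: %d, image format: %d", windowFormat, imageFormat);
}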

yakobom
  • The image is probably [YUV_420_888](https://developer.android.com/reference/android/graphics/ImageFormat.html#YUV_420_888), and the stride length should be accounted for. – Alex Cohn Aug 30 '17 at 20:32
  • 1
    Native buffer is in one of [AHARDWAREBUFFER_FORMAT_*](https://developer.android.com/ndk/reference/group___native_activity.html#ga99fb83031ce9923c84392b4e92f956b5) RGB formats. Cameras rarely support RGB output, so you cannot copy camera output to buffer.bits directly. You should add YUV→RGB conversion. – Alex Cohn Aug 30 '17 at 20:48
  • @AlexCohn What should I do if I just want to display the Y plane? As most conversion methods on the Internet are converting YUV to RGB. – fytao Aug 31 '17 at 13:53
  • If you only need grayscale, take the usual formula, but for U and V use a fixed value of 127 (see the sketch just below). – Alex Cohn Aug 31 '17 at 14:25
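
For illustration, a sketch of that per-pixel formula (a common BT.601-style integer YUV→RGB approximation; this helper is not from the question's code, and with U and V pinned near the 128 midpoint the chroma terms vanish, leaving grayscale):

#include <cstdint>

static inline uint8_t clamp255(int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

// Integer approximation of YUV -> RGB; passing u = v = 127 makes d and e ~0,
// so r, g, b all come out approximately equal to y (grayscale).
void yuvToRgb(uint8_t y, uint8_t u, uint8_t v,
              uint8_t *r, uint8_t *g, uint8_t *b) {
    int d = u - 128, e = v - 128;
    *r = clamp255(y + ((351 * e) >> 8));          // y + 1.371 * e
    *g = clamp255(y - ((179 * e + 86 * d) >> 8)); // y - 0.698 * e - 0.336 * d
    *b = clamp255(y + ((443 * d) >> 8));          // y + 1.732 * d
}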

As mentioned by yakobom, you're trying to copy a YUV_420_888 image directly into an RGBA_8888 destination (that's the default, if you haven't changed it). That won't work with just a memcpy.

You need to actually convert the data, and you need to ensure you don't copy too much - the sample code you have copies width*height*4 bytes, while a YUV_420_888 image takes up only stride*height*1.5 bytes (roughly). So when you copied, you were running way off the end of the buffer.
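
To make the overrun concrete (a sketch assuming no row padding, i.e. stride == width, and a hypothetical 1920x1080 preview):

#include <cstddef>

// Bytes the memcpy transfers when the frame is treated as RGBA_8888:
const size_t rgbaBytes = 1920 * 1080 * 4;      // 8,294,400 bytes
// Bytes actually present in a YUV_420_888 frame (full-res Y + quarter-res U and V):
const size_t yuvBytes  = 1920 * 1080 * 3 / 2;  // 3,110,400 bytes
// The copy runs 5,184,000 bytes past the end of the source buffer -> SIGSEGV.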

You also have to account for the stride provided at the Java level to correctly index into the buffer. This link from Microsoft has a useful diagram.

If you just care about the luminance (so grayscale output is enough), just duplicate the luminance channel into the R, G, and B channels. The pseudocode would be roughly:

uint8_t *outPtr = static_cast<uint8_t *>(buffer.bits);
for (size_t y = 0; y < height; y++) {
    // Step the source by its row stride and the destination by the window's
    // stride (buffer.stride is in pixels, 4 bytes each for RGBA_8888).
    uint8_t *rowPtr = srcLumaPtr + y * srcLumaStride;
    uint8_t *outRowPtr = outPtr + y * buffer.stride * 4;
    for (size_t x = 0; x < width; x++) {
        *(outRowPtr++) = *rowPtr; // R
        *(outRowPtr++) = *rowPtr; // G
        *(outRowPtr++) = *rowPtr; // B
        *(outRowPtr++) = 255;     // alpha (fully opaque) for RGBA_8888
        ++rowPtr;
    }
}

You'll need to read the srcLumaStride from the Image object (row stride of the first Plane) and pass it down via JNI as well.
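
A minimal sketch of what that change could look like (the extra srcLumaStride parameter and the updated call site are assumptions, not code from the question's repo):

#include <jni.h>
#include <cstdint>

// native-lib.cpp: same entry point as before, with the Y-plane row stride added.
// The Java declaration would become:
//   public static native void display(int srcWidth, int srcHeight, int srcLumaStride,
//                                     ByteBuffer srcBuffer, Surface surface);
// and the call site in onImageAvailable:
//   JNIUtils.display(image.getWidth(), image.getHeight(),
//                    image.getPlanes()[0].getRowStride(),
//                    image.getPlanes()[0].getBuffer(), surface);
extern "C" JNIEXPORT void JNICALL Java_tau_camera2demo_JNIUtils_display(
        JNIEnv *env, jclass clazz,
        jint srcWidth, jint srcHeight, jint srcLumaStride,
        jobject srcBuffer, jobject surface) {
    uint8_t *srcLumaPtr = reinterpret_cast<uint8_t *>(env->GetDirectBufferAddress(srcBuffer));
    if (srcLumaPtr == nullptr) return;
    // ... lock the window as before and run the grayscale loop above,
    // stepping the source by srcLumaStride per row ...
}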

Eddy Talvala
  • Thanks a lot for your detailed answer; my demo is now working fine. I still have one more question: the YUV2RGBA conversion I found [here](https://github.com/CyberAgent/android-gpuimage/blob/master/library/jni/yuv-decoder.c) does not mention the stride at all. Is that because the conversion is dealing with the whole YUV array, while I'm dealing with each plane separately? – fytao Sep 05 '17 at 10:34
  • It seems that the image buffer I'm dealing with has no padding, which means the stride is equal to the width. But I guess I should still assume the buffer may be padded, for robustness. – fytao Sep 06 '17 at 11:55