
I implemented Camera.PreviewCallback in which I then get the raw image (YV12 or NV12 format) as a byte array. I'm looking for a way to crop part of that image without converting it to a bitmap. The cropped part of the image will be streamed somewhere else (as byte array again).

Any help appreciated.

public class CameraAccess implements Camera.PreviewCallback, LoaderCallbackInterface {

    private byte[] lastFrame;

    @Override
    public void onPreviewFrame(byte[] frame, Camera arg1) {
        synchronized (this) {
            this.lastFrame = frame;
        }
    }

    public byte[] cropFrame(Integer x, Integer y, Integer width, Integer height) {
        synchronized (this) {
            // how to crop directly from byte array?
            return null;
        }
    }
}

Matthias

2 Answers


An image as a byte array is simply each pixel of the image in one huge array. It starts at the top-left pixel and travels to the right edge, then continues on the next line down (back at the left side).

So cropping is just a matter of copying the pixels you want into a new byte array with some for loops:

Rect cropArea = ... // the area to crop
int currentPos = 0;
byte[] croppedOutput = new byte[cropArea.width() * cropArea.height()];
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        // copy the pixel only if x and y fall within the crop area
        if (cropArea.contains(x, y)) {
            croppedOutput[currentPos++] = frame[positionInArrayForXY(x, y)];
        }
    }
}

There's some extra math you have to do for the method positionInArrayForXY; for a tightly packed frame it is essentially y * width + x.
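For reference, a minimal sketch of that mapping (my sketch, not from the answer; the width parameter is the full frame width, added here only to keep the helper self-contained):

// Hypothetical helper for the loop above: maps an (x, y) position to its
// index in a tightly packed, 1-byte-per-pixel frame of the given width.
static int positionInArrayForXY(int x, int y, int width) {
    return y * width + x; // complete rows first, then the column offset
}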

PS: I believe the frame is 1 byte per pixel, but I'm not sure about that; if it's 2 bytes per pixel there's some extra math involved. But the idea is the same and you can develop from it.

edit:

Answering your comment: no, there's no header in this thing, it's just the pixels straight away. That's why you're always given the camera information as well, so you can know the sizing.

You're right that it doesn't fit my answer; when I answered I was expecting the YUV data to follow the same array order as RGB does.

I did some extra research, and here you can see the method that does the YUV-to-RGB conversion; if you check it closely, you will notice that it uses 12 bits per pixel, which is 1.5 bytes => 921600 * 1.5 = 1382400.

so based on that I can think of a few ways to go:

  • (easiest to implement) transform your frame to RGB (I know you specified that you didn't want to, but it will be easier), do the crop as per my answer, and then stream it.
  • (biggest overhead, not so easy at all) if the receiver of the stream MUST receive YUV, do the above but convert it back to YUV by doing the inverse operation of the linked method before streaming.
  • (very tricky to implement, but solves your original question as asked) in light of my example code, the code in the link I posted, and the fact that it takes 12 bits per pixel, develop the code with the two for loops to do the crop (see the sketch right after this list).
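To illustrate that third option, here is a minimal sketch (my own addition, not code from the answer) of a direct crop on a semi-planar YUV420 frame such as NV12/NV21: a full-resolution Y plane followed by one interleaved chroma plane at half resolution. It assumes the crop rectangle has even x, y, width, and height so it stays aligned to the 2x2 chroma blocks; YV12 is planar with aligned row strides, so its chroma handling would differ:

// Sketch only: crops an NV12/NV21 frame directly in the byte array.
// Assumes cropX, cropY, cropW and cropH are all even.
public static byte[] cropNV21(byte[] frame, int frameWidth, int frameHeight,
        int cropX, int cropY, int cropW, int cropH) {
    // 12 bits per pixel = 1.5 bytes, hence the * 3 / 2
    byte[] out = new byte[cropW * cropH * 3 / 2];

    // copy the luma (Y) plane row by row
    for (int row = 0; row < cropH; row++) {
        System.arraycopy(frame, (cropY + row) * frameWidth + cropX,
                out, row * cropW, cropW);
    }

    // the interleaved chroma plane starts after the Y plane; it has half the
    // rows, and each row still spans frameWidth bytes (alternating chroma samples)
    int chromaSrc = frameWidth * frameHeight;
    int chromaDst = cropW * cropH;
    for (int row = 0; row < cropH / 2; row++) {
        System.arraycopy(frame, chromaSrc + (cropY / 2 + row) * frameWidth + cropX,
                out, chromaDst + row * cropW, cropW);
    }
    return out;
}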
Budius
  • But usually the header should contain some information about the image format, shouldn't it? Anyway, my preview is 1280x720, which should result in an array length of 921600 bytes. But the preview frame is 1382399 bytes long. That does not fit your solution, right? – Matthias Jan 14 '14 at 08:50
  • Thanks for your advanced research. Good information. I will try to crop directly from the YUV byte array. I want to avoid RGB conversion on the sender's side because it would use the CPU too much. – Matthias Jan 14 '14 at 12:10
  • Works fine, but I'm wondering about its performance, because it looks at every pixel of the original image; both for loops could just iterate over the crop region if the start and limit of each loop were calculated correctly. – Matthias Jan 14 '14 at 12:27
  • I totally agree with you. If you already know how to map an XY position to the YUV array, the next optimization step is to only iterate over that area. Adds complication to the code, but for sure it's a better final solution. – Budius Jan 14 '14 at 13:17
  • Hi @Matthias, could you please tell us which option you chose from the three mentioned above? If the third, I would ask you for some code snippets. – Mahendra Chhimwal Apr 19 '17 at 05:26
  • Hi @MahendraChhimwal, tough request, since my code is from 2012 and 2014 and I don't have a public GitHub repository. Anyway, I answered my own question with a new answer. You will find everything you need in there. Hope this helps. – Matthias Apr 19 '17 at 15:17

Someone asked for my final solution and some source code. So here's what I did.

Scenario: My project was based on a system-on-a-chip running Android. I implemented camera handling for two kinds of cameras: a local camera connected to the board via USB, which works like the camera on an Android smartphone, and an IP-based camera streaming images over the network. Therefore the software design might look a bit confusing. Feel free to ask questions.

Solution: Because OpenCV handling, camera initialization, and color/bitmap conversion are tricky things, I ended up encapsulating everything into two classes, thus avoiding duplicated code in multiple places of my Android code.

The first class handles the color/bitmap and OpenCV matrix conversions. It's defined as:

import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.core.Mat;    
import android.graphics.Bitmap;

public interface CameraFrame extends CvCameraViewFrame {
    Bitmap toBitmap();

    @Override
    Mat rgba();

    @Override
    Mat gray();
}

All color and bitmap conversion is within the implementation of this interface. The actual conversion is done by the utils that ship with OpenCV for Android. You will see that I'm using one Bitmap only; this saves resources, because bitmap conversions are CPU intensive. All UI components display/render this bitmap, and the conversion is only done when a component actually requests the bitmap.

private class CameraAccessFrame implements CameraFrame {
    private Mat mYuvFrameData;
    private Mat mRgba;
    private int mWidth;
    private int mHeight;
    private Bitmap mCachedBitmap;
    private boolean mRgbaConverted;
    private boolean mBitmapConverted;

    @Override
    public Mat gray() {
        return mYuvFrameData.submat(0, mHeight, 0, mWidth);
    }

    @Override
    public Mat rgba() {
        if (!mRgbaConverted) {
            Imgproc.cvtColor(mYuvFrameData, mRgba,
                    Imgproc.COLOR_YUV2BGR_NV12, 4);
            mRgbaConverted = true;
        }
        return mRgba;
    }

    // @Override
    // public Mat yuv() {
    // return mYuvFrameData;
    // }

    @Override
    public synchronized Bitmap toBitmap() {
        if (mBitmapConverted)
            return mCachedBitmap;

        Mat rgba = this.rgba();
        Utils.matToBitmap(rgba, mCachedBitmap);

        mBitmapConverted = true;
        return mCachedBitmap;
    }

    public CameraAccessFrame(Mat Yuv420sp, int width, int height) {
        super();
        mWidth = width;
        mHeight = height;
        mYuvFrameData = Yuv420sp;
        mRgba = new Mat();

        this.mCachedBitmap = Bitmap.createBitmap(width, height,
                Bitmap.Config.ARGB_8888);
    }

    public synchronized void put(byte[] frame) {
        mYuvFrameData.put(0, 0, frame);
        invalidate();
    }

    public void release() {
        mRgba.release();
        mCachedBitmap.recycle();
    }

    public void invalidate() {
        mRgbaConverted = false;
        mBitmapConverted = false;
    }
};

The camera handling is encapsulated in two special classes, which are explained later on. One (HardwareCamera implements ICamera) handles camera initialization and shutdown, while the second one (CameraAccess) handles OpenCV initialization and the notification of other components (CameraCanvasView extends CanvasView implements CameraFrameCallback) that are interested in receiving camera images and displaying them in an Android view (UI). Such components have to register with that class.

The callback (implemented by any UI component) is defined as this:

public interface CameraFrameCallback {
    void onCameraInitialized(int frameWidth, int frameHeight);

    void onFrameReceived(CameraFrame frame);

    void onCameraReleased();
}

The implementation of this interface is done by UI components like this:

import android.content.Context;
import android.util.AttributeSet;
import android.view.SurfaceHolder;
// plus the nested interface: import <your.package>.CameraAccess.CameraFrameCallback;

public class CameraCanvasView extends CanvasView implements CameraFrameCallback {

    private CameraAccess mCamera;
    private int cameraWidth = -1;
    private int cameraHeight = -1;
    private boolean automaticReceive;
    private boolean acceptNextFrame;

    public CameraCanvasView(Context context, AttributeSet attributeSet) {
        super(context, attributeSet);
    }

    public CameraAccess getCamera() {
        return mCamera;
    }

    public boolean getAcceptNextFrame() {
        return acceptNextFrame;
    }

    public void setAcceptNextFrame(boolean value) {
        this.acceptNextFrame = value;
    }

    public void setCamera(CameraAccess camera, boolean automaticReceive) {
        if (camera == null)
            throw new NullPointerException("camera");

        this.mCamera = camera;
        this.mCamera.setAutomaticReceive(automaticReceive);
        this.automaticReceive = automaticReceive;
    }

    @Override
    public void onCameraInitialized(int frameWidth, int frameHeight) {
        cameraWidth = frameWidth;
        cameraHeight = frameHeight;

        setCameraBounds();
    }

    public void setCameraBounds() {

        int width = 0;
        int height = 0;
        if (fixedWidth > 0 && fixedHeight > 0) {
            width = fixedWidth;
            height = fixedHeight;
        } else if (cameraWidth > 0 && cameraHeight > 0) {
            width = cameraWidth;
            height = cameraHeight;
        }

        if (width > 0 && height > 0)
            super.setCameraBounds(width, height, true);
    }

    @Override
    public void onFrameReceived(CameraFrame frame) {
        if (acceptNextFrame || automaticReceive)
            super.setBackground(frame);

        // reset
        acceptNextFrame = false;
    }

    @Override
    public void onCameraReleased() {

        setBackgroundImage(null);
    }

    @Override
    public void surfaceCreated(SurfaceHolder arg0) {
        super.surfaceCreated(arg0);

        if (mCamera != null) {
            mCamera.addCallback(this);

            if (!automaticReceive)
                mCamera.receive(); // we want to get the initial frame
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder arg0) {
        super.surfaceDestroyed(arg0);

        if (mCamera != null)
            mCamera.removeCallback(this);
    }
}

That UI component can be used in XML layout like this:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical" >

    <eu.level12.graphics.laser.CameraCanvasView
        android:id="@+id/my_camera_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        />

</LinearLayout>
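The canvasView used further below would then be looked up from this layout in the activity; a minimal sketch (onCreate and the layout resource name are my assumptions):

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_camera); // hypothetical layout name

    // matches the android:id declared in the XML above
    canvasView = (CameraCanvasView) findViewById(R.id.my_camera_view);
}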

The underlying CanvasView is responsible for drawing the camera image/bitmap onto an Android UI surface, which is another tricky thing and therefore also encapsulated. I'm sorry that I cannot add the full solution here, since this would be too much code.
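To give at least a rough idea of what is encapsulated there, here is a minimal sketch of the drawing step (my sketch, not the original code), assuming a SurfaceView-based CanvasView:

private void drawFrame(SurfaceHolder holder, CameraFrame frame) {
    Canvas canvas = holder.lockCanvas();
    if (canvas == null)
        return; // surface not ready yet

    try {
        // toBitmap() converts lazily and caches until the next frame invalidates it
        canvas.drawBitmap(frame.toBitmap(), 0, 0, null);
    } finally {
        holder.unlockCanvasAndPost(canvas);
    }
}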

Anyway, let's get back to the camera handling. The linking between UI components and the camera is done by the CameraAccess class that is also loading OpenCV on application startup.

import java.util.ArrayList;
import java.util.List;

import org.opencv.android.InstallCallbackInterface;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;

import android.content.Context;
import android.content.SharedPreferences;
import android.content.SharedPreferences.OnSharedPreferenceChangeListener;
import android.graphics.Rect;
import android.preference.PreferenceManager;
import android.text.TextUtils;
import android.util.Log;

public final class CameraAccess implements OnSharedPreferenceChangeListener,
        LoaderCallbackInterface {

    public static final int CAMERA_INDEX_IP = Integer.MAX_VALUE;
    private static final int CAM_NONE = -1;
    private static final int CAM_DEFAULT = 0;
    private static final String DEFAULT_IP = "127.0.0.1";

    // see http://developer.android.com/guide/topics/media/camera.html for more
    // details

    private final static String TAG = "CameraAccess";
    private Context context;
    private int cameraIndex;
    private String cameraURI;
    private List<CameraFrameCallback> mCallbacks = new ArrayList<CameraFrameCallback>();
    private List<IOpenCVLoadedCallback> mLoadedCallbacks = new ArrayList<IOpenCVLoadedCallback>();
    private SharedPreferences preferences;
    private ICamera camera;
    private int mFrameWidth;
    private int mFrameHeight;
    private boolean mOpenCVloaded;
    private boolean isFixed;
    private boolean isDirty;
    private final Rect roi = new Rect();
    private final ManualResetEvent automaticReceive = new ManualResetEvent(true);
    private final AutoResetEvent doReceive = new AutoResetEvent(true);

    private static CameraAccess mInstance;

    public static CameraAccess getInstance(Context context) {

        if (mInstance != null) {
            if (mInstance.isDirty) {
                if (!mInstance.isFixed) {
                    mInstance.releaseCamera();
                    mInstance.connectCamera();
                }

                mInstance.isDirty = false;
            }

            return mInstance;
        }

        mInstance = new CameraAccess(context);

        mInstance.isFixed = false;
        mInstance.connectCamera();

        return mInstance;
    }

    public static CameraAccess getIPCamera(Context context, String uri) {
        if (mInstance != null
                && Utils.as(NetworkCamera.class, mInstance) == null)
            throw new IllegalStateException(
                    "Camera already initialized as non-network/IP.");

        if (mInstance != null)
            return mInstance;

        mInstance = new CameraAccess(context);
        mInstance.connectIPCamera(uri);
        mInstance.isFixed = true;

        return mInstance;
    }

    private CameraAccess(Context context) {

        this.context = context;
        this.preferences = PreferenceManager
                .getDefaultSharedPreferences(context);
        this.preferences.registerOnSharedPreferenceChangeListener(this);
        this.cameraIndex = getCameraIndex();

        if (!OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_7, context,
                this)) {
            Log.e(TAG, "Cannot connect to OpenCVManager");
        } else
            Log.d(TAG, "OpenCVManager successfully connected");
    }

    public Context getContext() {
        return context;
    }

    public boolean isOpenCVLoaded() {
        return mOpenCVloaded;
    }

    @Override
    public void onManagerConnected(int status) {
        mOpenCVloaded = true;

        notifyOpenCVLoadedCallbacks();

        if (mCallbacks.size() > 0 && camera != null)
            camera.connect();
    }

    @Override
    public void onPackageInstall(int operation,
            InstallCallbackInterface callback) {
    }

    @Override
    public void onSharedPreferenceChanged(SharedPreferences sharedPreferences,
            String key) {

        String cameraSelectKey = context
                .getString(R.string.settings_select_camera_key);
        String cameraIPKey = context
                .getString(R.string.settings_camera_ip_address_key);

        if (key.equals(cameraIPKey) || key.equals(cameraSelectKey)) {
            this.preferences = sharedPreferences;
            this.cameraIndex = getCameraIndex();

            this.isDirty = true;
        }

    }

    private int getCameraIndex() {
        if (preferences == null || context == null)
            return CAM_NONE;

        String index = preferences.getString(
                context.getString(R.string.settings_select_camera_key), ""
                        + CAM_DEFAULT);

        this.cameraURI = preferences.getString(
                context.getString(R.string.settings_camera_ip_address_key),
                DEFAULT_IP);

        int intIndex;
        try {
            intIndex = Integer.parseInt(index);
            return intIndex;
        } catch (NumberFormatException ex) {
            Log.e(TAG, "Could not parse camera index: " + ex.getMessage());
            return CAM_NONE;
        }
    }

    public synchronized void addCallback(CameraFrameCallback callback) {

        if (callback == null) {
            Log.e(TAG, "Camera frame callback not added because it is null.");
            return;
        }

        // we don't care if the callback is already in the list
        this.mCallbacks.add(callback);

        Log.d(TAG, String.format("Camera frame callback added: %s (count: %d)",
                callback.getClass().getName(), this.mCallbacks.size()));

        if (camera != null) {
            if (camera.isConnected())
                callback.onCameraInitialized(mFrameWidth, mFrameHeight);
            else
                camera.connect();
        }
    }

    public synchronized void removeCallback(CameraFrameCallback callback) {

        if (callback == null) {
            Log.e(TAG,
                    "Camera frame callback not removed because it is null.");
            return;
        }

        boolean removed;
        do {
            // someone might have added the callback multiple times
            removed = this.mCallbacks.remove(callback);

            if (removed) {
                callback.onCameraReleased();

                Log.d(TAG, String.format(
                        "Camera frame callback removed: %s (count: %d)",
                        callback.getClass().getName(),
                        this.mCallbacks.size()));
            }
        } while (removed);

        if (mCallbacks.size() == 0)
            releaseCamera();
    }

    public synchronized void addOpenCVLoadedCallback(
            IOpenCVLoadedCallback callback) {

        if (callback == null) {
            return;
        }

        if (mOpenCVloaded) {
            callback.onOpenCVLoaded();
            return;
        }

        // we don't care if the callback is already in the list
        this.mLoadedCallbacks.add(callback);
    }

    // private synchronized void removeOpenCvCallback(
    // IOpenCVLoadedCallback callback) {
    //
    // if (callback == null)
    // return;
    //
    // boolean removed = false;
    // do {
    // // someone might have added the callback multiple times
    // removed = this.mLoadedCallbacks.remove(callback);
    //
    // } while (removed == true);
    // }

    private synchronized void notifyOpenCVLoadedCallbacks() {
        if (!mOpenCVloaded)
            return;

        for (IOpenCVLoadedCallback callback : mLoadedCallbacks)
            callback.onOpenCVLoaded();

        mLoadedCallbacks.clear();
    }

    public boolean isAutomaticReceive() {
        return automaticReceive.isSet();
    }

    public void setAutomaticReceive(boolean automatic) {
        if (automatic)
            automaticReceive.set();
        else
            automaticReceive.reset();
    }

    public boolean hasRegionOfInterest() {
        return !this.roi.isEmpty() && camera != null
                && camera.supportsRegionOfInterest();
    }

    public Rect getRegionOfInterest() {
        return this.roi;
    }

    public void setRegionOfInterest(Rect roi) {
        if (roi == null)
            this.roi.set(0, 0, 0, 0);
        else
            this.roi.set(roi);
    }

    public void receive() {
        doReceive.set();
    }

    public boolean waitForReceive(long milliseconds) {
        try {
            return doReceive.waitOne(milliseconds);
        } catch (InterruptedException e) {
            return false;
        }
    }

    private void connectCamera() {
        Log.d(TAG, "connect to camera " + cameraIndex);
        if (cameraIndex == CAMERA_INDEX_IP) {
            connectIPCamera(null);
        } else {
            connectLocalCamera();
        }
    }

    private void connectLocalCamera() {
        camera = new HardwareCamera(context, this, cameraIndex);
    }

    private void connectIPCamera(String uri) {

        if (TextUtils.isEmpty(uri))
            uri = cameraURI;

        if (TextUtils.isEmpty(uri))
            throw new NullPointerException(
                    "No URI (IP) for the remote network camera specified.");

        // camera = new NetworkCameraOpenCV(this, uri);
        camera = new NetworkCameraCached(this, uri);
        // camera = new NetworkCamera(this, uri);
        Log.d(TAG, "Connected to network camera: " + uri);
    }

    private synchronized void releaseCamera() {

        if (camera != null) {
            camera.release();

            for (CameraFrameCallback callback : mCallbacks)
                callback.onCameraReleased();
        }
    }

    public synchronized void onPreviewFrame(CameraFrame frame) {
        for (CameraFrameCallback callback : mCallbacks) {
            callback.onFrameReceived(frame);
        }
    }

    public synchronized void onCameraInitialized(int width, int height) {
        this.mFrameWidth = width;
        this.mFrameHeight = height;

        for (CameraFrameCallback callback : mCallbacks) {
            callback.onCameraInitialized(width, height);
        }
    }

    public interface CameraFrameCallback {
        void onCameraInitialized(int frameWidth, int frameHeight);

        void onFrameReceived(CameraFrame frame);

        void onCameraReleased();
    }

    public interface IOpenCVLoadedCallback {
        void onOpenCVLoaded();
    }

    public interface ICamera {

        boolean supportsRegionOfInterest();

        void connect();

        void release();

        boolean isConnected();
    }
}

A locally connected camera (the same as on Android smartphones) is implemented by the HardwareCamera class. The member user can be seen as the consumer of the images; it mediates between the camera and all UI components.

import java.io.IOException;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

import org.opencv.android.Utils;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.hardware.Camera.Parameters;
import android.hardware.Camera.Size;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;

public class HardwareCamera implements CameraAccess.ICamera,
        Camera.PreviewCallback {

    // see http://developer.android.com/guide/topics/media/camera.html for more
    // details

    private static final boolean USE_THREAD = true;

    private final static String TAG = "HardwareCamera";
    // private final Context context;
    private final int cameraIndex; // example: CameraInfo.CAMERA_FACING_FRONT or
                                    // -1 for
    // IP_CAM
    private final CameraAccess user;
    private Camera mCamera;
    private int mFrameWidth;
    private int mFrameHeight;
    private CameraAccessFrame mCameraFrame;
    private CameraHandlerThread mThread = null;
    private SurfaceTexture texture = new SurfaceTexture(0);

    // needed to avoid OpenCV error:
    // "queueBuffer: BufferQueue has been abandoned!"
    private byte[] mBuffer;

    public HardwareCamera(Context context, CameraAccess user, int cameraIndex) {
        // this.context = context;
        this.cameraIndex = cameraIndex;
        this.user = user;
    }

    // private boolean checkCameraHardware() {
    // if (context.getPackageManager().hasSystemFeature(
    // PackageManager.FEATURE_CAMERA)) {
    // // this device has a camera
    // return true;
    // } else {
    // // no camera on this device
    // return false;
    // }
    // }

    public static Camera getCameraInstance(int facing) {

        Camera c = null;
        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
        int cameraCount = Camera.getNumberOfCameras();
        int index = -1;

        for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
            Camera.getCameraInfo(camIdx, cameraInfo);
            if (cameraInfo.facing == facing) {
                try {
                    c = Camera.open(camIdx);
                    index = camIdx;
                    break;
                } catch (RuntimeException e) {
                    Log.e(TAG,
                            String.format(
                                    "Camera is not available (in use or does not exist). Facing: %s Index: %s Error: %s",
                                    facing, camIdx, e.getMessage()));

                    continue;
                }
            }
        }

        if (c != null)
            Log.d(TAG, String.format("Camera opened. Facing: %s Index: %s",
                    facing, index));
        else
            Log.e(TAG, "Could not find any camera matching facing: " + facing);

        // returns null if camera is unavailable
        return c;
    }

    private synchronized void connectLocalCamera() {
        if (!user.isOpenCVLoaded())
            return;

        if (USE_THREAD) {
            if (mThread == null) {
                mThread = new CameraHandlerThread(this);
            }

            synchronized (mThread) {
                mThread.openCamera();
            }
        } else {
            oldConnectCamera();
        }

        user.onCameraInitialized(mFrameWidth, mFrameHeight);
    }

    private /* synchronized */ void oldConnectCamera() {
        // synchronized (this) {
        if (true) { // checkCameraHardware()
            mCamera = getCameraInstance(cameraIndex);
            if (mCamera == null)
                return;

            Parameters params = mCamera.getParameters();
            List<Camera.Size> sizes = params.getSupportedPreviewSizes();

            // Camera.Size previewSize = sizes.get(0);
            Collections.sort(sizes, new PreviewSizeComparer());
            Camera.Size previewSize = null;
            for (Camera.Size s : sizes) {
                if (s == null)
                    break;

                previewSize = s;
            }

            // List<Integer> formats = params.getSupportedPictureFormats();
            // params.setPreviewFormat(ImageFormat.NV21);

            params.setPreviewSize(previewSize.width, previewSize.height);
            mCamera.setParameters(params);

            params = mCamera.getParameters();

            mFrameWidth = params.getPreviewSize().width;
            mFrameHeight = params.getPreviewSize().height;

            int size = mFrameWidth * mFrameHeight;
            size = size
                    * ImageFormat.getBitsPerPixel(params.getPreviewFormat())
                    / 8;

            this.mBuffer = new byte[size];
            Log.d(TAG, "Created callback buffer of size (bytes): " + size);

            Mat mFrame = new Mat(mFrameHeight + (mFrameHeight / 2),
                    mFrameWidth, CvType.CV_8UC1);
            mCameraFrame = new CameraAccessFrame(mFrame, mFrameWidth,
                    mFrameHeight);

            if (this.texture != null)
                this.texture.release();

            this.texture = new SurfaceTexture(0);

            try {
                mCamera.setPreviewTexture(texture);
                mCamera.addCallbackBuffer(mBuffer);
                mCamera.setPreviewCallbackWithBuffer(this);
                mCamera.startPreview();

                Log.d(TAG,
                        String.format(
                                "Camera preview started with %sx%s. Rendering to SurfaceTexture dummy while receiving preview frames.",
                                mFrameWidth, mFrameHeight));
            } catch (Exception e) {
                Log.d(TAG, "Error starting camera preview: " + e.getMessage());
            }
        }
        // }
    }

    @Override
    public synchronized void onPreviewFrame(byte[] frame, Camera arg1) {
        mCameraFrame.put(frame);

        if (user.isAutomaticReceive() || user.waitForReceive(500))
            user.onPreviewFrame(mCameraFrame);

        if (mCamera != null)
            mCamera.addCallbackBuffer(mBuffer);
    }

    // CameraAccessFrame lives here as a private inner class of HardwareCamera;
    // it is the exact class already shown in full at the top of this answer.

    private class PreviewSizeComparer implements Comparator<Camera.Size> {
        @Override
        public int compare(Size arg0, Size arg1) {
            if (arg0 == null && arg1 == null)
                return 0;
            if (arg0 != null && arg1 == null)
                return -1;
            if (arg0 == null)
                return 1;

            if (arg0.width < arg1.width)
                return -1;
            else if (arg0.width > arg1.width)
                return 1;
            else
                return 0;
        }
    }

    private static class CameraHandlerThread extends HandlerThread {
        Handler mHandler;
        HardwareCamera owner;

        CameraHandlerThread(HardwareCamera owner) {
            super("CameraHandlerThread");

            this.owner = owner;

            start();
            mHandler = new Handler(getLooper());
        }

        synchronized void notifyCameraOpened() {
            notify();
        }

        void openCamera() {
            mHandler.post(new Runnable() {
                @Override
                public void run() {
                    owner.oldConnectCamera();
                    notifyCameraOpened();
                }
            });

            try {
                wait();
            } catch (InterruptedException e) {
                Log.w(TAG, "wait was interrupted");
            }
        }
    }

    @Override
    public boolean supportsRegionOfInterest() {
        return false;
    }

    @Override
    public void connect() {
        connectLocalCamera();
    }

    @Override
    public void release() {
        synchronized (this) {

            if (USE_THREAD) {
                if (mThread != null) {
                    mThread.interrupt();
                    mThread = null;
                }
            }

            if (mCamera != null) {
                mCamera.stopPreview();
                mCamera.setPreviewCallback(null);
                try {
                    mCamera.setPreviewTexture(null);
                } catch (IOException e) {
                    Log.e(TAG, "Could not release preview-texture from camera.");
                }

                mCamera.release();

                Log.d(TAG, "Preview stopped and camera released");
            }
            mCamera = null;

            if (mCameraFrame != null) {
                mCameraFrame.release();
            }

            if (texture != null)
                texture.release();
        }
    }

    @Override
    public boolean isConnected() {
        return mCamera != null;
    }
}

The final step is linking it all together. This is done in the onResume method of your activity.

@Override
protected void onResume() {
    super.onResume();

    if (fourPointView != null) {
        cameraAccess = CameraAccess.getInstance(this);
        canvasView.setCamera(cameraAccess, true);
    } else {
        cameraAccess = null;
    }

    if (cameraAccess != null)
        cameraAccess.setAutomaticReceive(true);

    if (cameraAccess != null && fourPointView != null)
        cameraAccess.setRegionOfInterest(RectTools.toRect(canvasView
                .getCamera().getViewport()));
}

@Override
protected void onPause() {
    super.onPause();

    if (cameraAccess != null)
        cameraAccess.setRegionOfInterest(null);
}

Remark: I know it's not a full implementation, but I hope you get the point. The most interesting part is the color conversion anyway, and that is found at the top of this posting.
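And since the original question was about cropping: with OpenCV loaded, applying the stored region of interest boils down to a submat call. A sketch of my own (not from the project code) that returns the cropped pixels as a byte array for streaming:

// Sketch only: crop the converted frame to the ROI and export the pixels.
// submat() is a view without copying; clone() makes the crop contiguous
// so that get() can read it out in one call.
public byte[] cropFrame(CameraFrame frame, Rect roi) {
    Mat cropped = frame.rgba().submat(roi.top, roi.bottom, roi.left, roi.right).clone();

    byte[] out = new byte[(int) (cropped.total() * cropped.channels())];
    cropped.get(0, 0, out);
    cropped.release();
    return out;
}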

Matthias
  • Hi Matthias, any chance you found a way to do this without importing a lot of megabytes for the OpenCV stuff? – Rafael Sanches Feb 24 '18 at 23:46
  • It was too long ago. But I bet that there are solutions that work without OpenCV. You may also have a look into the source code of the OpenCV function that I'm using; just take the code that you need. – Matthias Feb 25 '18 at 16:33