
I have been able to take the byte array from my camera preview callback and convert it from YUV to RGB using RenderScript. Now I need to know how to render those RGB bytes back on screen. I believe it is not possible to update the camera preview itself, so I will need some sort of overlay view on top of my TextureView and then render the updated byte array onto the surface of that overlay view. Can someone kindly advise me on how I should go about this? This is what I have tried so far.

MainActivity.java

public class MainActivity extends Activity implements TextureView.SurfaceTextureListener, Camera.PreviewCallback {
    private byte[] FrameData = null;

    private Camera mCamera;
    private TextureView mTextureView;
    Context mContext;
    private Camera.Size previewSize;
    private ImageView mimageView;

    private SurfaceTexture mTexture;
    private int[] pixels;

    private int pwidth;
    private int pheight;
    Renderscript rv;


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mimageView = new ImageView(this);
        mTextureView = new TextureView(this);
        mTextureView.setSurfaceTextureListener(this);

        rv = new Renderscript();

        setContentView(mTextureView);
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {


        DisplayMetrics displayMetrics = this.getResources().getDisplayMetrics();
        int screenWidth = displayMetrics.widthPixels;
        int screenHeight = displayMetrics.heightPixels;

        int  mPreviewRate = Math.round((float)screenHeight / (float)screenWidth);
        mCamera = Camera.open();

        // Assign the field rather than shadowing it with a local variable.
        previewSize = mCamera.getParameters().getPreviewSize();

        mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                previewSize.width*mPreviewRate, previewSize.height*mPreviewRate, Gravity.CENTER));



        Camera.Parameters params = mCamera.getParameters();
        if (params.getSupportedFocusModes().contains(
                Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
            params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
        }
        if (params.isAutoExposureLockSupported()) {
            params.setAutoExposureLock(false);
        }
        mCamera.setParameters(params);




        // The preview buffer has the camera's preview dimensions, not the
        // scaled view dimensions, so use them as-is for the YUV conversion.
        pwidth = previewSize.width;
        pheight = previewSize.height;

        try {
            mCamera.setPreviewTexture(surface);
        } catch (IOException e) {
            Log.e("MainActivity", "Failed to set preview texture", e);
        }
        mCamera.setPreviewCallback(this);

        mCamera.startPreview();

        mTextureView.setAlpha(1.0f);
        mTextureView.setRotation(90.0f);


        pixels = new int[pwidth*pheight];



    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
        // Ignored, the Camera does all the work for us
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        // Clear the callback before stopping and releasing the camera;
        // calling into an already-released Camera throws a RuntimeException.
        mCamera.setPreviewCallback(null);
        mCamera.stopPreview();
        mCamera.release();

        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {
        // Invoked on every new preview frame. The preview callback is already
        // registered in onSurfaceTextureAvailable, so there is no need to
        // re-register it here.
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {

        // Note the parameter order: convertYUV2RGB expects height before width.
        byte[] rgbBytes = rv.convertYUV2RGB(MainActivity.this, data, pheight, pwidth);

        // sendbytearraytondkforopencvprocessing(rgbBytes);

    }

    static {
        Native.register(MainActivity.class, "native-lib");
    }
    public native void toGrayScale(int pixels[], int len);




}

Renderscript.java

 public byte[] convertYUV2RGB(Context c, byte[] YUVArray, int H, int W) { // W: 1280, H: 720
        // Note: creating the RenderScript context, the intrinsic and the
        // allocations on every frame is expensive; consider caching them.
        RenderScript rs = RenderScript.create(c);
        ScriptIntrinsicYuvToRGB yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));

        Type.Builder yuvType = new Type.Builder(rs, Element.U8(rs)).setX(YUVArray.length);
        Allocation in = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT);

        Type.Builder rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(W).setY(H);
        Allocation out = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT);

        in.copyFrom(YUVArray);

        yuvToRgbIntrinsic.setInput(in);
        yuvToRgbIntrinsic.forEach(out);

        byte[] RGBOut = new byte[W * H * 4];
        out.copyTo(RGBOut);
        return RGBOut;
    }
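As a sanity check on the buffer sizes involved (these numbers follow from the NV21 layout Android uses for preview frames by default, not from anything specific to this code): an NV21 frame carries W*H luma bytes plus W*H/2 interleaved chroma bytes, while the RGBA_8888 output is W*H*4 bytes.

```java
// Sanity-check helpers: expected buffer sizes for an NV21 preview frame
// and for its RGBA_8888 conversion output.
public class BufferSizes {
    // NV21: full-resolution Y plane plus a half-resolution interleaved V/U plane.
    static int nv21Bytes(int width, int height) {
        return width * height + (width * height) / 2; // = W*H*3/2
    }

    // RGBA_8888: four bytes per pixel.
    static int rgbaBytes(int width, int height) {
        return width * height * 4;
    }

    public static void main(String[] args) {
        // For the 1280x720 preview mentioned in the convertYUV2RGB comment:
        System.out.println(nv21Bytes(1280, 720)); // length expected for YUVArray
        System.out.println(rgbaBytes(1280, 720)); // length of RGBOut
    }
}
```

If the array you hand to `convertYUV2RGB` is not exactly `nv21Bytes(pwidth, pheight)` long, the width/height you are passing do not match the actual preview size.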
Phantômaxx

1 Answer


I believe that it is not possible to update the camera preview.

That's right.

I will need some sort of overlay view on top of my textureview and then render the updated byte array on to the surface of the overlay view.

You can see a similar approach in OpenCV: their Android demo displays an RGB preview. They don't use RenderScript, but that does not matter. Either way, the frame rate (which depends on CPU power and camera resolution) is usually not smooth enough.
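The OpenCV-style CPU path boils down to packing the RGBA byte array into ARGB ints, setting them on a Bitmap, and drawing that Bitmap on an overlay SurfaceView's Canvas. The packing step is plain Java and can be sketched as follows (the name `packRgbaToArgb` is my own, not from any API):

```java
// Pack RGBA bytes (the byte order produced by ScriptIntrinsicYuvToRGB with
// RGBA_8888 output) into ARGB ints, the format Bitmap.setPixels expects.
public class PixelPacker {
    static int[] packRgbaToArgb(byte[] rgba, int width, int height) {
        int[] argb = new int[width * height];
        for (int i = 0; i < argb.length; i++) {
            int r = rgba[i * 4]     & 0xFF;
            int g = rgba[i * 4 + 1] & 0xFF;
            int b = rgba[i * 4 + 2] & 0xFF;
            int a = rgba[i * 4 + 3] & 0xFF;
            argb[i] = (a << 24) | (r << 16) | (g << 8) | b;
        }
        return argb;
    }
}
```

On Android you would then call `bitmap.setPixels(argb, 0, width, 0, 0, width, height)` and draw the bitmap between `surfaceHolder.lockCanvas()` and `surfaceHolder.unlockCanvasAndPost(canvas)` on the overlay view.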

To achieve better performance, you can use the texture view, or, to be more exact, use the GPU by writing the pixels directly (glTexSubImage2D) into an EGL texture. See an example.
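The upload step might look roughly like this, assuming an EGL context is current and the texture was allocated once with `glTexImage2D` (`texId`, `width`, `height` and `rgbaBytes` are placeholder names, not from the question's code):

```java
// Assumes texId was created once via glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
// width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, null) on this EGL context.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texId);
// Overwrite the existing texture storage with this frame's converted RGBA bytes.
GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, width, height,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE,
        ByteBuffer.wrap(rgbaBytes));
// ...then draw a full-screen quad sampling this texture and call eglSwapBuffers().
```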

But if you do that, you actually waste a lot of time passing data back and forth between the CPU and GPU: once for the RenderScript conversion, and again to upload the texture. Instead, you can upload the YUV pixels to OpenGL directly and employ a fragment shader to convert them to RGB for display.
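With NV21 input, that means uploading the luma plane and the half-resolution interleaved V/U plane as two textures (GL_LUMINANCE and GL_LUMINANCE_ALPHA respectively) and applying the BT.601 conversion per fragment. A minimal shader sketch; the texture and varying names are my own:

```glsl
precision mediump float;
varying vec2 vTexCoord;
uniform sampler2D uTexY;   // full-res luma plane, uploaded as GL_LUMINANCE
uniform sampler2D uTexVU;  // half-res V/U plane, uploaded as GL_LUMINANCE_ALPHA

void main() {
    float y = texture2D(uTexY, vTexCoord).r;
    // NV21 stores V before U; LUMINANCE_ALPHA exposes them as .r and .a.
    float v = texture2D(uTexVU, vTexCoord).r - 0.5;
    float u = texture2D(uTexVU, vTexCoord).a - 0.5;
    // BT.601 (full-range) YUV -> RGB
    gl_FragColor = vec4(y + 1.402 * v,
                        y - 0.344 * u - 0.714 * v,
                        y + 1.772 * u,
                        1.0);
}
```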

Wyck
Alex Cohn