
I'm developing a Flutter plugin for Android and iOS using the Flutter camera plugin. First I stream frames from the plugin with this method:

controller.startImageStream((CameraImage img) {
  runOnFrame(
    bytesList: img.planes.map((plane) {
      return plane.bytes;
    }).toList(),
    imageHeight: img.height,
    imageWidth: img.width,
  );
});

The runOnFrame method is implemented in Java; it converts the YUV bytes to a bitmap in RGBA format like this:

Get-arguments function:

void getArgs(HashMap args) {
    List<byte[]> bytesList = (ArrayList) args.get("bytesList");
    int imageHeight = (int) args.get("imageHeight");
    int imageWidth = (int) args.get("imageWidth");
}

void runOnFrame(List<byte[]> bytesList, int imageHeight, int imageWidth) throws IOException {
    int rotation = 90;
    ByteBuffer Y = ByteBuffer.wrap(bytesList.get(0));
    ByteBuffer U = ByteBuffer.wrap(bytesList.get(1));
    ByteBuffer V = ByteBuffer.wrap(bytesList.get(2));

    int Yb = Y.remaining();
    int Ub = U.remaining();
    int Vb = V.remaining();

    byte[] data = new byte[Yb + Ub + Vb];

    Y.get(data, 0, Yb);
    V.get(data, Yb, Vb);
    U.get(data, Yb + Vb, Ub);

    Bitmap bitmapRaw = Bitmap.createBitmap(imageWidth, imageHeight, Bitmap.Config.ARGB_8888);
    Allocation bmData = renderScriptNV21ToRGBA888(
        mRegistrar.context(),
        imageWidth,
        imageHeight,
        data);
    bmData.copyTo(bitmapRaw);

    Matrix matrix = new Matrix();
    matrix.postRotate(rotation);
    Bitmap finalbitmapRaw = Bitmap.createBitmap(bitmapRaw, 0, 0, bitmapRaw.getWidth(), bitmapRaw.getHeight(), matrix, true);
    saveBitm(finalbitmapRaw); // Function to save converted bitmap
    
  } 
  public Allocation renderScriptNV21ToRGBA888(Context context, int width, int height, byte[] nv21) {
    // https://stackoverflow.com/a/36409748
    RenderScript rs = RenderScript.create(context);
    ScriptIntrinsicYuvToRGB yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));

    Type.Builder yuvType = new Type.Builder(rs, Element.U8(rs)).setX(nv21.length);
    Allocation in = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT);

    Type.Builder rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(width).setY(height);
    Allocation out = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT);

    in.copyFrom(nv21);

    yuvToRgbIntrinsic.setInput(in);
    yuvToRgbIntrinsic.forEach(out);
    return out;
  }

The problem is that I'm getting green bitmap images like this:

[screenshot of the green output]

  • How do you get the 3 byte arrays for yuv? – Alex Cohn Sep 03 '19 at 21:26
  • In controller.startImageStream I get the 3 planes in a List; please look at controller.startImageStream. – Raid Lafi Sep 03 '19 at 21:30
  • The question is *how* the byte buffers that are produced by camera2 ImageReader get converted to byte arrays that you get in controller.startImageStream(). – Alex Cohn Sep 04 '19 at 04:08
  • I updated the code , added the getArgs function that convert to byte array – Raid Lafi Sep 04 '19 at 13:22
  • It seems that Plane bytes is not a **byte[]**, but rather **Uint8List**, see https://github.com/flutter/flutter/issues/26348. You must use `bytesPerPixel` that is defined for each Plane, and skip the irrelevant bytes to produce the byte[] that goes into `yuvToRgbIntrinsic`. – Alex Cohn Sep 04 '19 at 15:12
  • Can you please explain to me how to produce the byte[] that goes into yuvToRgbIntrinsic? – Raid Lafi Sep 04 '19 at 20:24
  • Yes I can, but first, you must find your code that converts Uint8List to byte[]. Or maybe, runOnFrame() never gets byte[], but dart silently accepts Uint8List instead? – Alex Cohn Sep 05 '19 at 06:02
  • hey, did you fix this issue? Because I currently have the same problem, the result is greenish, broken. – Oğulcan Çelik Apr 15 '20 at 23:13
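
Following Alex Cohn's suggestion in the comments, here is a minimal sketch (not from the original post) of assembling a properly packed NV21 buffer from the three camera planes. It assumes the Dart side also sends each plane's `bytesPerRow` and `bytesPerPixel` (available as `plane.bytesPerRow` / `plane.bytesPerPixel` on CameraImage), which the current bytesList payload does not include:

```java
// Sketch: build a tightly packed NV21 (Y plane followed by interleaved
// V,U chroma) from the three planes delivered by the Flutter camera
// plugin, skipping row padding and chroma pixel-stride gaps.
public class Nv21Assembler {
    public static byte[] toNv21(byte[] y, byte[] u, byte[] v,
                                int width, int height,
                                int yRowStride, int uvRowStride,
                                int uvPixelStride) {
        byte[] nv21 = new byte[width * height * 3 / 2];
        int pos = 0;
        // Copy luma row by row; yRowStride may exceed width due to padding.
        for (int row = 0; row < height; row++) {
            System.arraycopy(y, row * yRowStride, nv21, pos, width);
            pos += width;
        }
        // Interleave chroma as V,U pairs (NV21 order). With
        // uvPixelStride == 2 the source planes are already semi-planar
        // and every other byte is skipped here.
        for (int row = 0; row < height / 2; row++) {
            for (int col = 0; col < width / 2; col++) {
                int idx = row * uvRowStride + col * uvPixelStride;
                nv21[pos++] = v[idx];
                nv21[pos++] = u[idx];
            }
        }
        return nv21;
    }
}
```

Feeding the result into `renderScriptNV21ToRGBA888` in place of the naively concatenated `data` array should remove the green cast, since the intrinsic expects exactly this layout rather than raw plane bytes with their padding intact.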

0 Answers