
I have an Android application and I've integrated my own custom TensorFlow Lite model.

I can print out the object category and probability score, but I'm at a loss on how to draw the bounding box of the object on the image that is uploaded.

Here is my code where I set the image into an ImageView.

    // SET THE IMAGE
    @Override
    protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
        super.onActivityResult(requestCode, resultCode, data);

        // Only proceed if this is our image request and the user actually picked an image
        if (requestCode == 100 && data != null && data.getData() != null) {
            Uri uri = data.getData();
            imageBox.setImageURI(uri);

            try {
                // Load the selected image into a Bitmap so the model can use it later
                img = MediaStore.Images.Media.getBitmap(this.getContentResolver(), uri);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

This is where I am integrating the model.

predictBtn.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {

                img = Bitmap.createScaledBitmap(img, 128, 128, true);

                try {
                    Android model = Android.newInstance(getApplicationContext());

                    // Creates inputs for reference.
                    TensorImage image = TensorImage.fromBitmap(img);

                    // Runs model inference and gets result.
                    Android.Outputs outputs = model.process(image);
                    Android.DetectionResult detectionResult = outputs.getDetectionResultList().get(0);

                    // Gets result from DetectionResult.
                    float score = detectionResult.getScoreAsFloat();
                    RectF location = detectionResult.getLocationAsRectF();
                    String category = detectionResult.getCategoryAsString();

                    // Releases model resources if no longer used.
                    model.close();
                    // here we will print out the results of the object to text views based on the image that is inputted by the user
                    // we print out object type and its accuracy score
                    objecttv.setText(category);
                    scoretv.setText(Float.toString(score));
                } catch (IOException e) {
                    // TODO Handle the exception
                }

            }
        });

What do I need to do to make use of RectF location = detectionResult.getLocationAsRectF(); ? I am using a static image, not tracking objects in real time.

1 Answer

Answer modified after the question changed.
You can draw the bounding box with a Canvas. Override onDraw() in the view where you want to draw the rectangle. Note that this code only works for a single bounding box; for multiple detections you need a for loop (a sketch for that case follows the DrawView implementation below). You should not call onDraw() directly from the activity; call invalidate() on the view instead and let the framework redraw it. Here is a complete example:

In your tracker class (the activity with the predict button):

// Init your canvas view
DrawView mCanvasView = (DrawView) findViewById(R.id.canvasView);

// ...

predictBtn.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            img = Bitmap.createScaledBitmap(img, 128, 128, true);
            try {
                Android model = Android.newInstance(getApplicationContext());
                // Creates inputs for reference.
                TensorImage image = TensorImage.fromBitmap(img);
                // Runs model inference and gets result.
                Android.Outputs outputs = model.process(image);
                Android.DetectionResult detectionResult = outputs.getDetectionResultList().get(0);
                // Gets result from DetectionResult.
                float score = detectionResult.getScoreAsFloat();
                RectF location = detectionResult.getLocationAsRectF();
                String category = detectionResult.getCategoryAsString();

                // Pass the detected box to the view and trigger a redraw
                mCanvasView.drawBoundingBox(location);

                // Releases model resources if no longer used.                                   
                model.close();
                
                // here we will print out the results of the object to text views based on the image that is inputted by the user
                // we print out object type and its accuracy score
                objecttv.setText(category);
                scoretv.setText(Float.toString(score));
            } catch (IOException e) {
                // TODO Handle the exception
            }
        }
});
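
Note that the RectF you get back is in the coordinate space of the 128x128 bitmap that went into the model (or in normalized [0, 1] coordinates, depending on how the model was exported), while the DrawView is laid out at screen size. A minimal mapping sketch, assuming pixel coordinates in the 128x128 input and a DrawView that exactly overlays the displayed image (scaleToView is a hypothetical helper, not part of the TFLite API):

private RectF scaleToView(RectF box, View overlay) {
    // Stretch the box from the 128x128 model input to the overlay view's size.
    // If your model returns normalized [0, 1] coordinates, replace 128f with 1f.
    float scaleX = overlay.getWidth() / 128f;
    float scaleY = overlay.getHeight() / 128f;
    return new RectF(box.left * scaleX, box.top * scaleY,
                     box.right * scaleX, box.bottom * scaleY);
}

With that in place you would call mCanvasView.drawBoundingBox(scaleToView(location, mCanvasView)); instead of passing the raw location.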

Implement this canvas view:

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.RectF;
import android.util.AttributeSet;
import android.view.View;

public class DrawView extends View {

    private final Paint boxPaint = new Paint();
    private RectF location;

    // This constructor is required so the view can be inflated from XML
    public DrawView(Context context, AttributeSet attrs) {
        super(context, attrs);
        boxPaint.setColor(Color.RED);
        boxPaint.setAlpha(200);
        boxPaint.setStrokeWidth(5f);
        boxPaint.setStyle(Paint.Style.STROKE);
    }

    public DrawView(Context context) {
        this(context, null);
    }

    public void drawBoundingBox(RectF location) {
        this.location = location;
        // Refresh the view; the framework will call onDraw()
        invalidate();
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        if (location != null) {
            canvas.drawRect(location, boxPaint);
        }
    }
}
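
The example above handles exactly one box. If your model returns several detections, one way to extend it (a sketch only; drawBoundingBoxes and the list field are hypothetical additions, not generated API) is to collect every box in the activity and let onDraw() loop over them:

// In the activity, collect every box above a confidence threshold
// (needs java.util.List / java.util.ArrayList imports):
List<RectF> boxes = new ArrayList<>();
for (Android.DetectionResult result : outputs.getDetectionResultList()) {
    if (result.getScoreAsFloat() > 0.5f) {   // 0.5f is an arbitrary threshold, tune it
        boxes.add(result.getLocationAsRectF());
    }
}
mCanvasView.drawBoundingBoxes(boxes);        // hypothetical List<RectF> overload

// In DrawView, store the list instead of a single RectF and loop in onDraw():
// for (RectF box : boxes) { canvas.drawRect(box, boxPaint); }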

Define your DrawView in the layout XML of your tracker activity. For the box to show up over the picture, the DrawView should sit on top of the ImageView (for example, stack both inside a FrameLayout):

<your.package.name.DrawView
    android:id="@+id/canvasView"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />