
Here is all my code (there isn't much so I figured I'd paste):

The thing is, I basically copied all this code from a tutorial, apart from the decodeYUV method, which I got from here.

My phone screen resolution is 480x800.

The key method I'm having problems with is the onPictureTaken method with the stream of bytes.

The problem is that when I take a picture, the bitmap comes out looking like this:

[screenshot of the distorted output bitmap]

public class HuntActivity extends AppCompatActivity implements SurfaceHolder.Callback{

    Camera camera;
    SurfaceView surfaceView;
    SurfaceHolder surfaceHolder;
    Camera.PictureCallback jpegCallback;
    ImageView sbut;

    @Override
    protected void onCreate(Bundle savedInstanceState) {

        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_hunt);
        sbut = (ImageView) findViewById(R.id.searchbut);
        surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
        surfaceHolder = surfaceView.getHolder();
        surfaceHolder.addCallback(this);
        surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

        jpegCallback = new Camera.PictureCallback() {
            public void onPictureTaken(byte[] data, Camera camera) {

                int[] rgbs = new int[480*800*3/2]; //buffer size
                decodeYUV(rgbs, data, 480, 800);
                Bitmap bitmap = Bitmap.createBitmap(rgbs, 480, 800, Bitmap.Config.ARGB_8888);
                sbut.setImageBitmap(bitmap);
                Toast.makeText(getApplicationContext(), "Picture Saved", 2000).show();
                refreshCamera();
            }
        };
    }

    public void captureImage(View v) throws IOException {
        camera.takePicture(null, null, jpegCallback);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {

        try {
            camera = Camera.open();
        } catch (RuntimeException e) {
            System.err.println(e);
            return;
        }
        Camera.Parameters param;
        param = camera.getParameters();
        param.setPreviewSize(800, 480);
        camera.setDisplayOrientation(90);
        camera.setParameters(param);
        try {
            camera.setPreviewDisplay(surfaceHolder);
            camera.startPreview();
        } catch (Exception e) {
            System.err.println(e);
            return;
        }

    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        refreshCamera();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        camera.stopPreview();
        camera.release();
        camera = null;
    }


    public void refreshCamera() {
        if (surfaceHolder.getSurface() == null) {
            return;
        }
        try {
            camera.stopPreview();
        } catch (Exception e) {
        }
        try {
            camera.setPreviewDisplay(surfaceHolder);
            camera.startPreview();
        } catch (Exception e) {
        }
    }

...
...
Greg Peckory
  • Does it make any difference if you use 800x480 for taken picture size similarly as you set preview size? Or is the image rotated? – harism Jan 23 '16 at 21:14
  • Thanks for the reply, but unfortunately I've tried this. It is the correct dimensions, given that it's rotated 90 degrees. – Greg Peckory Jan 23 '16 at 21:34

1 Answer

You implement a jpegCallback and then treat the data bytes as YUV-encoded. No: the data bytes contain a JPEG image, so treat them accordingly.
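One quick way to see this for yourself (a hypothetical check, not from the original post): JPEG streams always begin with the Start Of Image marker bytes `0xFF 0xD8`, so you can inspect the first two bytes of `data` before deciding how to decode it. NV21 preview frames carry no such marker.

```java
public class JpegCheck {
    // JPEG streams begin with the Start Of Image (SOI) marker: 0xFF 0xD8.
    // If this returns true, the bytes should go through a JPEG decoder
    // (e.g. BitmapFactory.decodeByteArray on Android), not a YUV converter.
    static boolean looksLikeJpeg(byte[] data) {
        return data != null && data.length >= 2
                && (data[0] & 0xFF) == 0xFF
                && (data[1] & 0xFF) == 0xD8;
    }

    public static void main(String[] args) {
        byte[] jpegHeader = {(byte) 0xFF, (byte) 0xD8, (byte) 0xFF, (byte) 0xE0};
        byte[] rawLuma    = {0x10, 0x20, 0x30, 0x40}; // arbitrary non-JPEG bytes
        System.out.println(looksLikeJpeg(jpegHeader)); // true
        System.out.println(looksLikeJpeg(rawLuma));    // false
    }
}
```

If the check passes, the buffer handed to the third `takePicture` callback is a complete JPEG file and can even be written to disk as-is.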

greenapps
  • Thanks for the answer. That `jpegCallback` just happens to be the variable name from the tutorial. It implements `Camera.PictureCallback`. The documentation says the format of the data depends on the context of the callback and the `Camera.Parameters` settings. I believe they are referring to the `setPictureFormat` method, which I never call. I thought by default it is YUV? Am I wrong? – Greg Peckory Jan 24 '16 at 10:33
  • Oh I see what you mean. Are you talking about `camera.takePicture(null, null, jpegCallback)`? – Greg Peckory Jan 24 '16 at 10:47
  • If so, that makes total sense, but what can I do to fix it? – Greg Peckory Jan 24 '16 at 10:48
  • BitmapFactory can decode a jpg from a byte array to a Bitmap too. Anyhow, having BitmapFactory convert a jpg to a Bitmap has been published here hundreds of times. You should be able to find examples on this site. – greenapps Jan 24 '16 at 10:52
  • I'm more concerned with extracting the RGB values, not getting the Bitmap. But I'll do some research. – Greg Peckory Jan 24 '16 at 10:58
  • Then change the subject of your post as you have already confused me. – greenapps Jan 24 '16 at 10:59
  • Done. In my case I was testing the RGBs by creating a Bitmap. But apologies for the confusion. – Greg Peckory Jan 24 '16 at 11:01
  • '(null, null, jpegCallback)'. The second parameter is for a rawCallback. Untested: maybe you get rgbs there? – greenapps Jan 24 '16 at 11:04
  • I'll give it a shot. Thanks for all the help! – Greg Peckory Jan 24 '16 at 11:07
  • The Camera class is deprecated now. There is camera2. You should consider if you will still spend valuable time on the old class. – greenapps Jan 24 '16 at 11:11
  • At the moment I'm running on older operating systems, so I need the older API. Apparently my best bet is to get the YUV values and convert them to RGB. Clearly I wasn't getting correct YUV values. I'll have to see what I was doing wrong there. – Greg Peckory Jan 24 '16 at 11:19
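For reference, the per-pixel math that a decodeYUV-style method applies to NV21 preview frames is the standard integer approximation of the ITU-R BT.601 conversion. A plain-Java sketch (method names are mine, not from the tutorial; it converts one Y/U/V triple and shows how to pull the R, G, B channels back out of the packed ARGB int):

```java
public class YuvToRgb {
    // Clamp a value into the valid 0..255 channel range.
    static int clamp(int x) {
        return x < 0 ? 0 : (x > 255 ? 255 : x);
    }

    // Integer approximation of the BT.601 YUV -> RGB conversion,
    // as used by typical NV21 decoders. Returns a packed ARGB pixel.
    static int yuvToArgb(int y, int u, int v) {
        int c = y - 16, d = u - 128, e = v - 128;
        int r = clamp((298 * c + 409 * e + 128) >> 8);
        int g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
        int b = clamp((298 * c + 516 * d + 128) >> 8);
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    public static void main(String[] args) {
        // In video range, Y=16 with neutral chroma is black; Y=235 is white.
        System.out.printf("%08X%n", yuvToArgb(16, 128, 128));  // FF000000
        System.out.printf("%08X%n", yuvToArgb(235, 128, 128)); // FFFFFFFF

        // Extracting the individual RGB channels from a packed pixel:
        int pixel = yuvToArgb(128, 90, 200);
        int red   = (pixel >> 16) & 0xFF;
        int green = (pixel >> 8) & 0xFF;
        int blue  = pixel & 0xFF;
        System.out.println(red + " " + green + " " + blue);
    }
}
```

Note this only helps with preview frames (or the raw callback, if the device supplies one); the bytes delivered to the jpeg callback are already JPEG-compressed and must be decoded as such.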