
I am trying to write an application that triggers the Android camera at a fixed time interval. I was testing it with TimerTask, but I read that I am not supposed to trigger the camera again until the JPEG is ready. Is there a way to trigger the camera at a fixed interval, let each JPEG arrive whenever it is ready, then trigger the next shot, and so on, without causing some sort of heap overflow? Is there a way to do this with camera2?

Here are the relevant methods I have so far:

PictureCallback onPicTake = new PictureCallback() {

    @Override
    public void onPictureTaken(byte[] bytes, Camera camera) {
        Log.d("data size", "" + bytes.length);
        Log.d("taken", "taken");
        new SaveImageTask(getStorage()).execute(bytes);
        resetCam();
    }
};

Camera.ShutterCallback onShutter = new Camera.ShutterCallback() {

    @Override
    public void onShutter() {
        AudioManager mgr = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
        mgr.playSoundEffect(AudioManager.FLAG_PLAY_SOUND);
    }
};

private class CameraTrigger extends TimerTask {

    public void run() {
        mCamera.takePicture(onShutter, null, onPicTake);
    }
}
preview.setOnClickListener(new View.OnClickListener() {

    @Override
    public void onClick(View arg0) {
        timer = new Timer();
        timer.schedule(new CameraTrigger(), 0, 1000);
    }
});
private void resetCam() {
    mCamera.startPreview();
    preview.setCamera(mCamera);
}

1 Answer

There is nothing terribly wrong in your code, as long as you know for sure that onPictureTaken() will not take more than 1000 ms.

One optimization I would suggest, counterintuitively, is not to save the picture in a background task, but rather to do it directly on the callback thread.

The reason is that the huge memory chunk behind bytes cannot easily be garbage collected while a background task still holds a reference to it. From the point of view of the JVM, the following pattern does not put a burden on the garbage collector:

  • byte[] bytes = new byte[1Mb];
  • fill bytes with something
  • onPreviewFrame(bytes);
  • nobody needs bytes again
  • bytes memory is reclaimed

But if there are outstanding references to bytes, it becomes much harder for the GC to reclaim that memory, and you can see spikes of CPU usage, the app not responding, and eventually even TimerTask callbacks being delayed.
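
For illustration, a minimal sketch of saving on the callback thread, assuming getStorage() from the question returns the target directory (the file name and the "save" log tag are made up):

PictureCallback onPicTake = new PictureCallback() {
    @Override
    public void onPictureTaken(byte[] bytes, Camera camera) {
        // Write the JPEG here, on the camera's callback thread, so that no
        // background task keeps a reference to the byte array alive.
        File out = new File(getStorage(), System.currentTimeMillis() + ".jpg");
        try (FileOutputStream fos = new FileOutputStream(out)) {
            fos.write(bytes);
        } catch (IOException e) {
            Log.e("save", "failed to write " + out, e);
        }
        resetCam();
        // Once this method returns, nothing references 'bytes' any more,
        // so the GC can reclaim the buffer promptly.
    }
};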

Note that it is not healthy to run onPictureTaken() on the main (UI) thread. To keep the camera callbacks in the background, you need to open the camera on a secondary Looper thread (see this example).
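
A rough sketch of that setup, assuming the deprecated android.hardware.Camera API from the question (the thread and handler names are illustrative; mCamera is the field from the question):

HandlerThread cameraThread = new HandlerThread("CameraBackground");
cameraThread.start();
Handler cameraHandler = new Handler(cameraThread.getLooper());

cameraHandler.post(new Runnable() {
    @Override
    public void run() {
        // android.hardware.Camera delivers its callbacks on the event loop of
        // the thread that opened it, so onPictureTaken() will run here rather
        // than on the main (UI) thread.
        mCamera = Camera.open();
    }
});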

  • Thank you. Since the interval is user input, it may drop below the time onPictureTaken takes, or be any value. Actually I tested it at 1000 ms, and onPictureTaken takes more time than that. Basically I don't need onPictureTaken to finish before the next picture; I just need to be allowed to take another picture while onPictureTaken is finishing, and continuously queue them up. Does camera2 allow this? – dylan7 Nov 19 '15 at 16:04
  • Your goal seems impossible: if it takes 1.5 sec to process one picture, then after 10 sec you will have a "debt" of 3 full pictures, all in memory. It will quickly exhaust the [limits of the OS](http://developer.android.com/training/articles/memory.html#RestrictingMemory) – Alex Cohn Nov 19 '15 at 19:20
  • Ok, so I have to set some sort of minimum constraint on the user. Thank you. I am trying to put this on a UAV, so I don't need to see the preview surface, just the result stored in memory, but it seems the camera library requires a viewing surface to be there. Is there a way to remove this, or is it not a lot of processing to have running? – dylan7 Nov 19 '15 at 19:37
  • The Android Camera requirement is that the preview stream be displayed to the end user. This is not (and cannot be) fully enforced, but different devices and systems require different workarounds. At any rate, live preview display uses dedicated hardware and has almost no effect on CPU load. – Alex Cohn Nov 20 '15 at 19:24
  • BTW, if you don't need very high resolution images, you may be better served if you use **onPreviewFrame()** callbacks instead of **onPictureTaken()** callbacks. This means that instead of Jpeg buffers you will receive YUV buffers (support for other formats is sparse; the new **camera2** API may deliver raw Bayer RGB images and more). – Alex Cohn Nov 20 '15 at 19:28
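
As a rough sketch of the onPreviewFrame() route mentioned in the last comment, assuming the deprecated android.hardware.Camera API from the question and its default NV21 preview format:

mCamera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // 'data' is one preview frame in YUV (NV21 is the default format).
        Camera.Size size = camera.getParameters().getPreviewSize();
        YuvImage yuv = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
        ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
        // Compress a frame to JPEG only when a still image is actually needed.
        yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 85, jpeg);
        // ... write jpeg.toByteArray() to storage, as in onPictureTaken() ...
    }
});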