
I am trying to understand the "Getting Started with CameraX" tutorial, in particular the image analysis part.

import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageProxy
import java.nio.ByteBuffer

typealias LumaListener = (luma: Double) -> Unit

class LuminosityAnalyzer(private val listener: LumaListener) : ImageAnalysis.Analyzer {

   private fun ByteBuffer.toByteArray(): ByteArray {
       rewind()    // Rewind the buffer to zero
       val data = ByteArray(remaining())
       get(data)   // Copy the buffer into a byte array
       return data // Return the byte array
   }

   override fun analyze(image: ImageProxy) {

       val buffer = image.planes[0].buffer
       val data = buffer.toByteArray()
       val pixels = data.map { it.toInt() and 0xFF }
       val luma = pixels.average()

       listener(luma)

       image.close()
   }
}

I have compiled the example from step "6. Implement ImageAnalysis use case" and I think it works as expected.

This is some of my log data:

19:26:08.152  D  Average luminosity: 105.57404947916666
19:26:08.190  D  Average luminosity: 105.57047526041667
19:26:08.240  D  Average luminosity: 105.60593424479167
19:26:08.273  D  Average luminosity: 105.60776041666666
19:26:08.305  D  Average luminosity: 105.60956380208333
19:26:08.346  D  Average luminosity: 105.60956380208333
19:26:08.388  D  Average luminosity: 105.61209635416667
19:26:08.431  D  Average luminosity: 105.61344075520833

Every 30 to 50 ms a frame gets analyzed. It works fine but eats up all CPU time.

How can I put back-pressure on the camera to slow down the analysis?

Update: It is possible to configure the back-pressure strategy, but this is not sufficient. The name seems to be a misnomer, because the pressure never reaches the camera. The back-pressure strategy only specifies what happens when the camera produces frames faster than the analyzer can process them. In that case you have two options: queue the overflow frames or throw them away, which is the default. But neither option has any impact on the camera itself, which keeps producing too many frames. The CPU is still highly utilized generating frames that are simply thrown away. This is exactly what happens if I call sleep at the end of the analyze function.
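For reference, this is roughly how the strategy is configured on the builder (a sketch based on the ImageAnalysis documentation; it only chooses between the two behaviors described above and does not throttle the camera):

```kotlin
import androidx.camera.core.ImageAnalysis

val imageAnalysis = ImageAnalysis.Builder()
    // STRATEGY_KEEP_ONLY_LATEST (the default) throws overflow frames away;
    // STRATEGY_BLOCK_PRODUCER queues them instead. Neither slows the camera down.
    .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
    .build()
```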

I am looking for a way to tell the camera to produce fewer frames.

ceving
    Have you looked into the backpressure configuration in the builder? https://developer.android.com/reference/kotlin/androidx/camera/core/ImageAnalysis.Builder#setBackpressureStrategy(int) – cactustictacs Nov 04 '22 at 00:45
  • @cactustictacs See my update of the question. – ceving Nov 04 '22 at 08:03
  • You could have a look at this thread, where people are trying to increase the framerate for analysis (only you'd lower it of course): https://stackoverflow.com/q/57485050/13598222 I feel like, by default, the camera is going to try to hit some target framerate for its preview so it looks responsive to the user. And adding an image analyser isn't supposed to directly impact that framerate, it's meant to be a (potentially slow) async operation that runs alongside the preview - that's why there's a backpressure setting that has no influence on the producer, which emits frames at its own pace – cactustictacs Nov 04 '22 at 08:46

1 Answer


Currently, there isn't a convenient way to set the FPS for ImageAnalysis, if that's what you are looking for. However, you can achieve the same result by dropping frames that you don't need.

Borrowing your code sample, if you want to rate-limit the analysis to 10 FPS, you could do:

import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageProxy
import java.nio.ByteBuffer

class LuminosityAnalyzer(private val listener: LumaListener) : ImageAnalysis.Analyzer {
    private var latestAnalyzedTimestamp = 0L
    private val maxFps = 10

    private fun ByteBuffer.toByteArray(): ByteArray {
        rewind()    // Rewind the buffer to zero
        val data = ByteArray(remaining())
        get(data)   // Copy the buffer into a byte array
        return data // Return the byte array
    }

    override fun analyze(image: ImageProxy) {
        // The timestamp is in nanoseconds (it comes from
        // CaptureResult.SENSOR_TIMESTAMP), so the minimum frame
        // interval must be expressed in nanoseconds as well.
        val timestamp = image.imageInfo.timestamp
        if (timestamp - latestAnalyzedTimestamp < 1_000_000_000L / maxFps) {
            // Drop the frame to lower the effective analysis FPS.
            image.close()
            return
        }

        val buffer = image.planes[0].buffer
        val data = buffer.toByteArray()
        val pixels = data.map { it.toInt() and 0xFF }
        val luma = pixels.average()

        listener(luma)

        latestAnalyzedTimestamp = timestamp
        image.close()
    }
}
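The timestamp check can also be factored into a small plain-Kotlin helper, which makes the nanosecond arithmetic easy to test in isolation (a sketch; `FrameRateLimiter` and `shouldAnalyze` are made-up names, and timestamps are assumed to be in nanoseconds like the sensor timestamps above):

```kotlin
// Decides whether a frame should be analyzed, given a target maximum FPS.
// Timestamps are nanoseconds, as delivered by ImageInfo.getTimestamp().
class FrameRateLimiter(maxFps: Int) {
    private val minIntervalNanos = 1_000_000_000L / maxFps
    private var lastAcceptedNanos = -1L

    fun shouldAnalyze(timestampNanos: Long): Boolean {
        if (lastAcceptedNanos >= 0 &&
            timestampNanos - lastAcceptedNanos < minIntervalNanos
        ) {
            return false // Too soon after the last analyzed frame: drop it.
        }
        lastAcceptedNanos = timestampNanos
        return true
    }
}

fun main() {
    val limiter = FrameRateLimiter(maxFps = 10) // at most one frame per 100 ms
    println(limiter.shouldAnalyze(0L))           // true: first frame accepted
    println(limiter.shouldAnalyze(33_000_000L))  // false: only 33 ms later
    println(limiter.shouldAnalyze(66_000_000L))  // false: only 66 ms later
    println(limiter.shouldAnalyze(100_000_000L)) // true: full interval elapsed
}
```

The analyzer would then just ask `shouldAnalyze(image.imageInfo.timestamp)` and call `image.close()` immediately when it returns false.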
Xi 张熹
  • I don't think ImageProxy holds the correct timestamp. On adding logs I found it had 2167742952710751 then 2167742986007886 – Astha Garg Jan 10 '23 at 06:57
  • 1
    The timestamp comes from the value of CaptureResult.SENSOR_TIMESTAMP, which is based on SystemClock.elapsedRealtimeNanos(). With your number, (2167742986007886-2167742952710751)/1000000 = 33ms, which is 30FPS. But I guessed the Javadoc can be improved. – Xi 张熹 Jan 10 '23 at 15:35