
The goal is to get real-time image processing working on the live camera preview of an Android device. This has been tackled on SO many times and is mentioned in various places on the web (see the references at the end of the question); however, I haven't found a single suggestion for a proper multithreaded framework to accomplish it.

To my understanding, to make proper use of multithreading (and devices with more than one physical core) one would implement the following:

  1. The frame callback (onPreviewFrame) is invoked on the UI thread and enqueues the frame data onto a shared queue Q.
  2. A thread pool of n threads waits on Q. Each thread dequeues a frame whenever one is available, performs the "heavy" image processing, and posts a message with the result back to the UI thread (see the sketch after this list).
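
A minimal sketch of this producer/consumer setup, assuming the (now deprecated) `android.hardware.Camera` API; `heavyProcessing` and `publishResult` are hypothetical stubs, not from the original question:

```java
import android.hardware.Camera;
import android.os.Handler;
import android.os.Looper;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Bounded queue between the camera callback (producer) and the
// worker pool (consumers); results are posted back to the UI thread.
class FrameDispatcher implements Camera.PreviewCallback {
    private static final int WORKERS = 2;

    private final BlockingQueue<byte[]> queue =
            new ArrayBlockingQueue<byte[]>(WORKERS * 2);
    private final ExecutorService pool = Executors.newFixedThreadPool(WORKERS);
    private final Handler uiHandler = new Handler(Looper.getMainLooper());

    FrameDispatcher() {
        for (int i = 0; i < WORKERS; i++) {
            pool.execute(new Runnable() {
                @Override public void run() {
                    try {
                        while (true) {
                            byte[] frame = queue.take();   // blocks until a frame arrives
                            final Object result = heavyProcessing(frame);
                            uiHandler.post(new Runnable() {
                                @Override public void run() { publishResult(result); }
                            });
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt(); // pool shutdown
                    }
                }
            });
        }
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // The camera reuses 'data' after this call returns, so copy first.
        // offer() drops the frame when the workers are saturated, instead
        // of blocking the UI thread.
        queue.offer(data.clone());
    }

    private Object heavyProcessing(byte[] frame) { return frame; } // stub: your image processing
    private void publishResult(Object result) { }                  // stub: update the UI
}
```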

My main question is whether the above is the right direction and, if so, how exactly to accomplish it on Android. Also, what modifications would be required if the image processing had to be done serially, i.e. with no more than one frame being processed at any given time, but still on a thread separate from the main one?
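
For the serial variant, the same structure collapses to a single worker and a one-slot queue; a sketch of the only fields that would change in the class above:

```java
// Serial case: one worker thread, and a one-slot queue so that at most
// one frame is pending; offer() drops frames that arrive while busy.
private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<byte[]>(1);
private final ExecutorService pool = Executors.newSingleThreadExecutor();
```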


Update 1:
I ended up implementing this as an AsyncTask that receives the raw frame data as input and uses the onPostExecute callback to update the UI. However, this fires about 10 times per second, which makes me wonder whether it generates too much overhead. As far as I'm concerned this question is therefore still open, pending validation that this is indeed the most efficient method. It is also still unclear to me how one would expand this approach to multiple worker threads on multiple cores.
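
For reference, a sketch of what that AsyncTask approach looks like; `ProcessFrameTask`, `process`, and the `ImageView` overlay are illustrative names, not from the original post:

```java
import android.graphics.Bitmap;
import android.os.AsyncTask;
import android.widget.ImageView;

// One task per frame: doInBackground runs off the UI thread,
// onPostExecute runs back on it.
class ProcessFrameTask extends AsyncTask<byte[], Void, Bitmap> {
    private final ImageView overlay;

    ProcessFrameTask(ImageView overlay) { this.overlay = overlay; }

    @Override
    protected Bitmap doInBackground(byte[]... frames) {
        return process(frames[0]);          // stub: heavy image processing
    }

    @Override
    protected void onPostExecute(Bitmap result) {
        overlay.setImageBitmap(result);     // runs on the UI thread
    }

    private Bitmap process(byte[] frame) { return null; } // stub
}

// From onPreviewFrame:
// new ProcessFrameTask(overlayView).execute(data.clone());
```

Note that since API 11, `execute()` runs tasks one at a time on a serial executor, so this is effectively the serial case from the question; to fan work out across cores one would call `executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR, ...)` instead.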


Related questions:
- Video processing in Android
- Android Camera Frame processing with MultiThreading
- Android: How to record video and process its frames in real time?
- Processing Android camera frames in real time
- Ways of getting high FPS for real time computer vision processing
- Multithreading in openCV4Android
- Android: Real Time Image Processing
- Android video frame processing
- Simultaneous camera preview and processing
- Real Time Image Processing in Android using the NDK
- Android Camera Preview filtering
- How to show real time filtered camera preview while recording videos?

And links:
http://ibuzzlog.blogspot.co.il/2012/08/how-to-do-real-time-image-processing-in.html

  • I'm still searching for the answer to this question if anyone can help... I would be happy to edit/add information if someone finds the question lacking clarity. – stav Apr 07 '14 at 07:30
  • An SO question that has slipped my research and answers my question exactly is [here](http://stackoverflow.com/questions/15799487/parallel-image-detection-and-camera-preview-opencv-android). – stav Aug 11 '14 at 12:48
  • Stav, I'm trying something very similar and found your post. I also read the SO post that you said answered your question, but could not understand an important part of it so I asked him a question. Then realized you had surely solved this and since you've posted much more recently than he maybe you can help me. He wrote "separate each thread by a fixed delay". If cores process in parallel, why would this help? And how would you separate each thread by a fixed delay anyway? Once you start a thread doesn't it just run without regard for when it was spawned in relation to other threads? – Alyoshak Sep 25 '14 at 16:50
  • Well I haven't implemented it yet, and I think the real open question that isn't answered by @Rick77's answer is how *exactly* to implement the ThreadPool. But, per your question: the delaying would not help in terms of performance, but it might make more sense for most use cases. If your processing returns the user some meaningful answer about the current frame, then you would want those answers evenly spread through time. This is the most efficient way to make the user feel like the processing is being done all the time. – stav Sep 25 '14 at 20:35

1 Answer


The Android documentation states that the byte[] array passed to onPreviewFrame() cannot simply be put in a queue for later use, because the array is reused for each call. The callback also has to return as soon as possible to sustain a good frame rate.
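
(For reference, the framework's `setPreviewCallbackWithBuffer()`/`addCallbackBuffer()` pair gives explicit control over that reuse: a buffer is only recycled once you hand it back. A minimal sketch, assuming the deprecated `android.hardware.Camera` API:)

```java
import android.graphics.ImageFormat;
import android.hardware.Camera;

// Pre-allocate a few buffers; the camera only reuses a buffer after
// it is returned with addCallbackBuffer().
Camera.Parameters params = camera.getParameters();
Camera.Size size = params.getPreviewSize();
int bufferSize = size.width * size.height
        * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;
for (int i = 0; i < 3; i++) {
    camera.addCallbackBuffer(new byte[bufferSize]);
}
camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        // process (or copy) 'data' here, then hand the buffer back:
        cam.addCallbackBuffer(data);
    }
});
```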

So here is what you can do (I did this and got 29 FPS):

  1. Pass the array to a native C or C++ library through a JNI call (the Java side of this boundary is sketched after this list).

  2. Implement a queue in your C code and malloc a node to store the array.

  3. Copy the array into the node's memory and return from the call immediately.

  4. Enqueue the node for processing, and dequeue nodes to process with an encoder such as FFmpeg.

  5. Run the queueing and the encoding in parallel, so that each frame can be freed from memory once it has been encoded, preventing memory overflow.

  6. Remember to choose a resolution at which your encoder keeps pace with the input frame rate, so that the queue does not grow without bound.
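
A minimal sketch of the Java side of that JNI boundary; the library name, class, and native methods are illustrative assumptions, not from the original answer:

```java
// Java half of the JNI approach: the native library owns the queue,
// copies the frame into a malloc'd node, and returns immediately.
public class NativeFrameQueue {
    static {
        System.loadLibrary("framequeue"); // hypothetical libframequeue.so
    }

    // Copies 'data' into a native node and enqueues it for encoding.
    public static native void enqueueFrame(byte[] data, int width, int height);

    // Starts/stops the native encoder thread that drains the queue.
    public static native void startEncoder(String outputPath);
    public static native void stopEncoder();
}

// In onPreviewFrame(byte[] data, Camera camera):
//     NativeFrameQueue.enqueueFrame(data, previewWidth, previewHeight);
```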

  • Thanks for your answer; however, I'm looking to do this in Java only. The challenge is implementing a proper thread pool that works against a queue such as the one you described. Also, while I feel I know what the general framework should look like, I'm more interested in the implementation specifics, i.e. which threading tools to use, etc. – stav Aug 03 '14 at 12:33