
TL;DR

In Android Studio, using the Java version of OpenNI, the following line of code causes my app to stop receiving OpenNI frame data, and I don't know how to put processed data back into the same VideoFrameRef or how to process it with OpenCV:

byte[] fb = frameBuffer.array();

where frameBuffer is of type ByteBuffer.
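For context, `ByteBuffer.array()` only works for heap-backed buffers; buffers handed out by native APIs are typically direct and have no backing array, so `array()` throws. A minimal, self-contained sketch of the failure mode and a copy that works for either buffer kind (the `copyOut` helper is my own name, not an OpenNI API):

```java
import java.nio.ByteBuffer;

public class BufferCopyDemo {
    // Copy a ByteBuffer's contents into a byte[] without relying on array(),
    // which throws UnsupportedOperationException for direct buffers.
    static byte[] copyOut(ByteBuffer buf) {
        ByteBuffer dup = buf.duplicate();  // leave the original's position untouched
        dup.rewind();
        byte[] out = new byte[dup.remaining()];
        dup.get(out);                      // bulk copy works for any buffer kind
        return out;
    }

    public static void main(String[] args) {
        // Direct buffers (as native APIs typically return) have no backing array.
        ByteBuffer direct = ByteBuffer.allocateDirect(4);
        direct.put(new byte[] {1, 2, 3, 4});
        System.out.println("hasArray: " + direct.hasArray());  // false
        byte[] copy = copyOut(direct);
        System.out.println("copied " + copy.length + " bytes");
    }
}
```

If `frame.getData()` returns a direct buffer, the same `get(byte[])` pattern should copy it out safely.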

Can somebody show me how to do so, or how to simply compile OpenCV with OpenNI support so I can do all the initialization etc in C++ before sending processed data back to Java to be displayed?

Edit

The following code solved the missing data issue:

VideoMode currVidMode = frame.getVideoMode();
PixelFormat currPixelFormat = currVidMode.getPixelFormat();

ByteBuffer frameBuffer = frame.getData();
// Size the copy from the buffer itself; depth formats use more than one
// byte per pixel, so ResolutionY * ResolutionX alone can undercount.
byte[] fB = new byte[frameBuffer.remaining()];
if (frameBuffer.hasArray()) {
    fB = frameBuffer.array();
} else {
    // Direct buffers have no backing array; bulk-copy instead.
    frameBuffer.get(fB);
}

Now all that is left is to properly cast the data and interface with OpenCV, though building OpenCV with OpenNI support is still preferable if somebody knows how to do it on Android.
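On the casting question: OpenNI depth formats such as PIXEL_FORMAT_DEPTH_1_MM are 16-bit per pixel, so the raw byte[] holds two little-endian bytes per pixel. A sketch of decoding it with only the JDK (the method name and the 1x1 "frame" are illustrative, not from the original code):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;

public class DepthDecode {
    // Reinterpret a raw depth frame (2 bytes per pixel, little-endian)
    // as unsigned 16-bit depth values in millimeters.
    static int[] toDepthMillimeters(byte[] raw, int width, int height) {
        ShortBuffer sb = ByteBuffer.wrap(raw)
                                   .order(ByteOrder.LITTLE_ENDIAN)
                                   .asShortBuffer();
        int[] depth = new int[width * height];
        for (int i = 0; i < depth.length; i++) {
            depth[i] = sb.get(i) & 0xFFFF;  // mask to treat the short as unsigned
        }
        return depth;
    }

    public static void main(String[] args) {
        // One-pixel "frame": bytes 0x10, 0x27 little-endian -> 0x2710 = 10000 mm
        byte[] raw = {0x10, 0x27};
        System.out.println(toDepthMillimeters(raw, 1, 1)[0]);  // prints 10000
    }
}
```

From there, if the OpenCV Java bindings are on the classpath, one way to hand the data over (untested here) is to create a `Mat(height, width, CvType.CV_16UC1)` and fill it with `Mat.put(0, 0, shortArray)` using a `short[]` read from the same ShortBuffer.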

Problem Statement

I would like to have direct access to the matrix data of images captured from OpenNI-compliant devices (specifically, the Orbbec Astra Mini) in order to perform real-time image processing on each frame before displaying it on the screen. As I understand it, there are two ways to do this:

  1. EITHER Build OpenCV from source with OpenNI support and use its functions to capture Mat objects.
  2. OR include both libraries independently, initialize a camera, and copy the buffer over.

However, I am running into the following issues.

Issues with Approach 1

For approach 1, I don't have experience using CMake or building libraries from source. Although I'm confident I could do this in my native x64 Windows environment, I am having a lot of trouble following tutorials like the one on OpenCV's site, and I'm confused about how to use flags like "-D WITH_OPENNI=ON" in Android Studio. (Where do I put the OpenNI binaries for proper compiling? Do I have to use a special compiler for the ARM chip in my Galaxy Note 3? etc.) These are basic questions, but I'm inexperienced enough that I need a little bit of guidance nonetheless.
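For what it's worth, a desktop-style configure step with the OpenNI flag looks roughly like the sketch below. The paths are placeholders, and which variables the OpenNI finder reads varies by OpenCV version (check `cmake/OpenCVFindOpenNI2.cmake` in the source tree); for Android one would additionally pass the NDK's `android.toolchain.cmake` via `-DCMAKE_TOOLCHAIN_FILE` and supply OpenNI binaries built for ARM.

```shell
# Untested sketch of configuring OpenCV with OpenNI2 support on desktop;
# all paths are placeholders.
cmake -D WITH_OPENNI2=ON \
      -D CMAKE_BUILD_TYPE=Release \
      /path/to/opencv
make -j4
```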

Approach 1 seems like it would result in cleaner code, so it would be extremely helpful if somebody could help with that. But if not, approach 2 is probably easier.

Issues with Approach 2

I already have a working project which displays the frames from the OpenNI-compatible sensor (depth, RGB, IR) on the screen. It is adapted from the Android sample code from Orbbec, the manufacturer. The code that reads and updates the frames is below, and I'm sure those functions work.

while (mShouldRun) {
    VideoFrameRef frame = null;

    try {
        OpenNI.waitForAnyStream(streams, 100);

        // Get the frame
        frame = mStream.readFrame();

        // Process the frame
        /*
        ByteBuffer frameBuffer = frame.getData();

        byte[] fB = frameBuffer.array();
        */

        // Request rendering of the current OpenNI frame
        mFrameView.update(frame);

...

    } catch (TimeoutException e) {
        // No stream became ready within 100 ms; try again
    } catch (Exception e) {
        Log.e(TAG, "Failed reading frame: " + e);
    }
}

However, when the line below is uncommented, my Android device no longer receives any frames. (My guess is that frame.getData() returns a direct ByteBuffer with no backing array, so array() throws, and the exception is swallowed by the generic catch block.)

byte[] fB = frameBuffer.array();

I need to be able to store these frames in order to run OpenCV on them, but I cannot seem to get it right, even after a fairly thorough internet search on the matter.

I'm sorry I cannot post all my links; as a new user, I cannot post more than the two above. After a few days of failing at this, I am turning to this community for help. Let me know if I can make my question any clearer or better.

  • Here is the link to the NIViewer.Android project I used as a template. The code fragment above is in the StreamViewer.java file. http://www.orbbec3d.net/Tools_SDK_OpenNI/1-Android.zip – Anastasios Nikolas Angelopoulo Aug 26 '17 at 06:57
  • Found this, so I'm not sure if it's related: https://github.com/occipital/OpenNI2/issues/43 Is your device on KitKat (4.4) - and rooted? – Morrison Chang Aug 26 '17 at 07:22
  • Thanks! Unfortunately, the error is not related to opening the device, but rather getting raw frame data as a matrix in order to process it. I am running Lollipop 5.0. – Anastasios Nikolas Angelopoulo Aug 26 '17 at 07:32
  • I'm reading the source from your link... if your commented-out code is causing the issue, it may be that you aren't allocating space for the ByteBuffer you need for processing: https://stackoverflow.com/questions/4841340/what-is-the-use-of-bytebuffer-in-java Although I have concerns about OpenCV being able to keep up with the frames provided. – Morrison Chang Aug 26 '17 at 07:59
  • Thank you. I actually re-compiled the code and now the original ByteBuffer framebuffer = frame.getData() works, and after reading framebuffer into the byte[] array no frames are processed by the device. I will edit my question to reflect this. – Anastasios Nikolas Angelopoulo Aug 26 '17 at 08:28
  • And now the byte[] array works properly, but I still have to interface it with OpenCV. – Anastasios Nikolas Angelopoulo Aug 26 '17 at 08:45

0 Answers