
Trying to get the camera working from native code on Android ICS: most manuals refer to the startPreview() method. But while browsing the AOSP code I also found a startRecording() method in <Camera.h>. It is said there to come from the interface ICameraRecordingProxy, which "allows the recorder to receive video frames during recording".

So the question is: in terms of performance, is the startRecording() approach more efficient than startPreview()?

The only goal of going into native code is performance: the Java Camera API is too slow, and OpenCV does not provide the required level of FPS either.

EDIT: the target platform is API level 17 on an Allwinner A31 development board, at 1280x720 @ 30 FPS. The task is to capture frames from the camera, modify them, encode them (H.264) and store the result to the SD card. The pure Java MediaRecorder writes an MP4 file at 1280x720 @ 30 FPS. Showing a live preview on screen is not needed.

OpenCV-demo1 in native mode gives 1920x1080 at 2 FPS (the same in Java mode). With the simple Java approach and an empty PreviewCallback, the maximal FPS is 15.

Thank you in advance..

Sergii
  • Grafika can record from the camera preview -- works fine at 30fps on a 2012 Nexus 7 with nothing but Java code. (It does require API 16+ for `MediaCodec` though.) See "Show + capture camera" in https://github.com/google/grafika . If you're primarily interested in straight recording, perhaps what you want is `MediaRecorder` (http://developer.android.com/reference/android/media/MediaRecorder.html)? – fadden Feb 17 '14 at 16:22
  • thank you for the link, but Grafika is "an SDK app, developed for API 18 (Android 4.3)" and my target SDK is 17 :( I tried to build and run it, with no luck. MediaRecorder is nice and works well, but I need to modify frames on the fly – Sergii Feb 17 '14 at 17:45
  • Some of what Grafika does will work in API 16. What you don't get until API 18 is `Surface` input to `MediaCodec`, which unfortunately is going to be important for performance in what you're doing. Depending on what sort of frame modifications you're planning to do, the stuff API 18 gives you can help in other ways... a demo of Grafika employing some simple GPU-shader-based image filters can be found here: http://www.youtube.com/watch?v=kH9kCP2T5Gg – fadden Feb 17 '14 at 17:59
  • thanks a lot, Grafika is very interesting, but the problem is that the target device cannot be upgraded to API 18 :( – Sergii Feb 17 '14 at 18:04
  • Targeting older APIs is pretty common, usually for market share reasons. You should mention the target API in the question, and also give a sense for what sort of modifications you're planning to make to the frames (i.e. simple frame insertion / removal, fancy image filtering, whether you want to work in YUV or RGB space, etc.). Do you need to show a preview on-screen at the same time you're recording? You said you're not hitting your target FPS rate; what rate do you need, and at what resolution? What's your target device? – fadden Feb 17 '14 at 18:11
  • thank you. I just modified the initial post with an 'EDIT' mark. The target platform is an Allwinner A31 dev board. – Sergii Feb 17 '14 at 19:17

3 Answers

In terms of performance there is no gain in going for the native camera. Using Camera.setPreviewCallbackWithBuffer() in Java (off the UI thread) gives as many frames per second as any native alternative. But on some SoCs, e.g. Samsung, the camera output may be wired directly (zero-copy) to the HW H.264 encoder, which naturally gives excellent throughput. This is what the "pure java MediaRecorder" does under the hood. You cannot achieve the same if any manipulation of the buffer is involved.

Alex Cohn
  • why? According to http://stackoverflow.com/questions/10670953/how-do-i-get-the-raw-android-camera-buffer-in-c-using-jni the frame data is copied at least a few times before it reaches the Java world, so any Java access to the frames should be slower than native access. I was thinking of taking the MediaRecorder sources from AOSP and modifying them to get the work done at an FPS slightly lower than MediaRecorder provides.. should it work? – Sergii Feb 17 '14 at 22:24
  • This explanation is not convincing. What they count as 2 memcpy's is usually only passing pointers, switching virtual addresses etc. I should repeat that direct wiring of the camera to the encoder within the same system process does give MediaRecorder some advantage. With the A31 CPU, and if you use 2 slice threads for ultra-fast x264 encoding (see the sketch after these comments), you should be able to achieve 720p at 15 FPS or more. – Alex Cohn Feb 17 '14 at 23:01
  • Modifying the AOSP MediaRecorder will not help, because the actual heavy lifting happens in the system media service, and you will immediately lose all advantage when you try to modify the data. – Alex Cohn Feb 17 '14 at 23:06
  • do you think it is possible in principle (Java or native) to reach 1280x720 @ 30 FPS while capturing, watermarking, encoding and storing video to the SD card? – Sergii Feb 17 '14 at 23:27
  • The best results that I know of with x264 on a modern quad-core device (Samsung Note 10) were about half that throughput: we could reach 800x480 @ 30 FPS. With the HW encoder (via stagefright), 1280x720 @ 30 FPS is possible, but I don't have first-hand experience with the A31. – Alex Cohn Feb 18 '14 at 08:13
  • you mean the best results you reached were by using onPreviewCallbackWithBuffer, then going to stagefright via JNI? – Sergii Feb 18 '14 at 13:31
  • My quote is for the software encoder; with _onPreviewCallbackWithBuffer, then going to stagefright by JNI_, it should be possible to achieve 720p @ 30 fps, even on a modest CPU. – Alex Cohn Feb 18 '14 at 13:56
  • probably it is very device-specific; in my first tests on the A31, onPreviewCallbackWithBuffer never got more than 15 FPS, while first native-access testing gives 720p @ 30 FPS – Sergii Feb 19 '14 at 08:33
  • You mean - empty callback for `onPreviewCallbackWithBuffer` vs. MediaRecorder? – Alex Cohn Feb 19 '14 at 08:42
  • I mean empty onPreviewCallbackWithBuffer() and empty dataCallbackTimestamp() after Camera.startRecording() in native code – Sergii Feb 19 '14 at 13:10
  • Too bad… Have you checked that the callback does not happen on the UI thread? As I explained, the system may use a short-circuit to record video with the HW encoder, so that it all happens without IPC. – Alex Cohn Feb 19 '14 at 16:05
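A minimal sketch of the x264 configuration Alex Cohn describes above (ultra-fast encoding with 2 slice threads). The preset and field names are the standard x264 API; the resolution, color space, and exact values are assumptions for this use case:

```cpp
#include <x264.h>

// Hypothetical helper: open an x264 encoder tuned for speed, using two
// slice threads (each thread encodes part of every frame, which keeps
// latency low), as suggested in the comment thread above.
x264_t* open_fast_encoder(int width, int height, int fps) {
    x264_param_t param;
    // "ultrafast" trades quality for speed; "zerolatency" avoids lookahead.
    if (x264_param_default_preset(&param, "ultrafast", "zerolatency") < 0)
        return nullptr;
    param.i_width = width;
    param.i_height = height;
    param.i_fps_num = fps;
    param.i_fps_den = 1;
    param.i_threads = 2;         // two worker threads, per the comment above
    param.b_sliced_threads = 1;  // slice-based (intra-frame) threading
    param.i_csp = X264_CSP_I420; // assumption: frames converted to I420 first
    return x264_encoder_open(&param);
}
```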

To close the topic: I was able to reach 1280x720 at 30 FPS using native access to the camera and the hardware H.264 encoder. I can also modify (watermark) the data on the fly while keeping the FPS high. None of the other approaches, whether Java or OpenCV, could give more than 15 FPS (maybe I did not try hard enough..)

startRecording() works perfectly

thank you for the comments

Sergii
  • are you using the native allwinner encoder c code? something like AWcodecTest.cpp? http://dl.linux-sunxi.org/SDK/A23-v1.0/unpacked/A23/android/hardware/aw/AWcodecTest/ – huisinro Apr 12 '14 at 19:12
  • no, I used part of AOSP: #include <…> and <…> – Sergii Apr 18 '14 at 11:14
  • can you provide some sample code? Highly appreciate it. – huisinro Apr 18 '14 at 16:50
  • I used AOSP as a reference, with the following sequence of calls: sp<Camera> m_camera = Camera::connect(0); m_camera->setPreviewDisplay(surface); m_camera->setParameters(params); m_camera->setRecordingProxyListener(listener); m_camera->startRecording(); (see the fuller sketch after these comments) – Sergii Apr 22 '14 at 06:35
  • It would be very helpful if you could share a sample project of it. – Reaz Murshed May 19 '14 at 10:35
  • showing the code would help lots! that's what SO is for after all. – Dec 03 '14 at 05:56
  • the camera API in AOSP changes from version to version. I worked with API level 17 and all the code refers to it. Since the project is commercial, I cannot share the sources directly. I can only describe the steps to reach it: – Sergii Mar 01 '15 at 19:59
  • for example, to access raw YUV data from the camera in the NDK: create a class implementing the interface **ICameraRecordingProxyListener** and implement the method **dataCallbackTimestamp(nsecs_t timestamp, int32_t msgType, const sp<IMemory>& data)**; this method will receive frames from the camera. For example, to make a copy of the whole frame, call **memcpy(yuvBuffer, (const byte*)data.get()->pointer(), data.get()->size());** – Sergii Mar 01 '15 at 20:14
  • @user2199593 I'm curious how you used the AOSP code in your app. `camera/Camera.h` is not in the NDK, so how did you link them? – jfly Dec 28 '15 at 01:24
  • that was quite long ago, but I guess I downloaded the Android sources and linked the required files in Android.mk – Sergii Dec 28 '15 at 13:12
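Based on the steps described in this answer's comments, here is a minimal sketch of the native recording path for API level 17. It builds against AOSP platform headers (not the public NDK), and the parameter handling, listener class, and processFrame() helper are assumptions:

```cpp
#include <camera/Camera.h>                         // AOSP header, not in the NDK
#include <camera/CameraParameters.h>
#include <camera/ICameraRecordingProxyListener.h>
#include <binder/IMemory.h>

using namespace android;

// Assumed listener: dataCallbackTimestamp() receives each recorded frame.
class FrameListener : public BnCameraRecordingProxyListener {
public:
    FrameListener(const sp<Camera>& camera) : mCamera(camera) {}

    virtual void dataCallbackTimestamp(nsecs_t timestamp, int32_t msgType,
                                       const sp<IMemory>& data) {
        if (msgType == CAMERA_MSG_VIDEO_FRAME) {
            // Watermark/encode here. Depending on the device, this buffer may
            // be raw YUV or an opaque metadata struct (see the next answer).
            processFrame(data->pointer(), data->size());
        }
        // Return the buffer to the camera, or its buffer queue runs dry.
        mCamera->releaseRecordingFrame(data);
    }

private:
    void processFrame(void* frame, size_t size);   // hypothetical consumer
    sp<Camera> mCamera;
};

// The call sequence quoted in the comments above, with hypothetical
// parameter setup for 1280x720 @ 30 FPS.
sp<Camera> startNativeRecording(const sp<Surface>& surface) {
    sp<Camera> camera = Camera::connect(0);        // camera id 0, API 17 signature
    if (camera == NULL) return NULL;

    CameraParameters params(camera->getParameters());
    params.setVideoSize(1280, 720);
    params.setPreviewFrameRate(30);
    camera->setParameters(params.flatten());

    camera->setPreviewDisplay(surface);            // a preview target is still required
    camera->setRecordingProxyListener(new FrameListener(camera));
    camera->startRecording();
    return camera;
}
```

Per this answer, it is this path, combined with the hardware encoder, that sustained 1280x720 @ 30 FPS on the A31 board.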
dataCallbackTimestamp(nsecs_t timestamp, int32_t msgType, const sp<IMemory>& data)

when

msgType==CAMERA_MSG_VIDEO_FRAME

gives data in an internal form. A pure YUV frame is not guaranteed: for example, data.get()->size() can be bigger than the YUV frame size, or it can be some 20-byte data structure referencing the real(?) frame buffer kept somewhere in the camera's buffer list.
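A hedged sketch of the check this implies, inside dataCallbackTimestamp(). For NV21 at 1280x720 a full frame is width*height*3/2 bytes; a much smaller buffer is most likely a vendor metadata handle, whose layout only the vendor HAL knows. The sizes and the consumeYuv() helper are assumptions:

```cpp
#include <binder/IMemory.h>

void consumeYuv(uint8_t* frame, size_t size);      // hypothetical consumer

// Decide whether "data" carries pixel data or an opaque metadata handle.
void handleVideoFrame(const android::sp<android::IMemory>& data) {
    const size_t kWidth = 1280, kHeight = 720;
    const size_t kYuvSize = kWidth * kHeight * 3 / 2;  // 1,382,400 bytes for NV21

    if (data->size() >= kYuvSize) {
        // Looks like a real YUV frame (it may still contain row strides,
        // which is why it can be bigger than the nominal frame size).
        consumeYuv(static_cast<uint8_t*>(data->pointer()), data->size());
    } else {
        // A small buffer (e.g. ~20 bytes) is a vendor metadata struct that
        // points at the real frame in the camera's buffer list.
    }
}
```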

So, this topic is not complete yet. :)

Gost
  • yes, it happens: one of my 3 test devices gives a proper YUV frame without strides (like in preview mode), and in that case all works fine. – Gost Mar 05 '15 at 09:07
  • Some cameras can provide YUV frames in streaming mode: try setting **camera->storeMetaDataInBuffers(false)**. When it succeeds, the camera works fine, but one extra copy operation may occur (e.g. from GPU memory to the host). There is no universal way to get YUV in streaming mode before Android API 21. Some metadata representations are used only for hardware-to-hardware data transfers, and in that case only the vendor knows how to interpret them. That problem can be solved too (see the sketch after these comments), but it is justified only for particular cases, not in general. – Gost Mar 17 '15 at 06:19
  • Some metadata formats are simple. Look for **extractGrallocData** or **kMetadataBufferTypeGrallocSource** cases in libstagefright. – Gost Mar 17 '15 at 06:32
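A short sketch of the fallback Gost describes: asking the HAL to deliver raw YUV instead of metadata handles. storeMetaDataInBuffers() is the AOSP Camera method quoted above; treating OK as acceptance, and the call ordering, are assumptions:

```cpp
#include <camera/Camera.h>

using namespace android;

// Try to switch the recording path from metadata handles to raw YUV frames.
// Call before startRecording(); not all HALs honor this before API 21.
bool requestRawYuvRecording(const sp<Camera>& camera) {
    if (camera->storeMetaDataInBuffers(false) == OK) {
        // dataCallbackTimestamp() should now deliver full YUV frames, possibly
        // at the cost of one extra copy (e.g. GPU memory -> host memory).
        return true;
    }
    // The HAL insists on metadata buffers: the payload is a vendor struct.
    // For the common gralloc-source layout, see extractGrallocData and
    // kMetadataBufferTypeGrallocSource in libstagefright.
    return false;
}
```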