
What is the best choice for rendering video frames obtained from a decoder bundled into my app (FFmpeg, etc.)?

I would naturally tend to choose OpenGL as mentioned in Android Video Player Using NDK, OpenGL ES, and FFmpeg.

But in OpenGL in Android for video display, a comment notes that OpenGL isn't the best method for rendering video.

What then? The jnigraphics native library? And a non-GL SurfaceView?

Please note that I would like to use a native API for rendering the frames, such as OpenGL or jnigraphics. But Java code for setting up a SurfaceView and such is OK.

PS: MediaPlayer is irrelevant here; I'm talking about decoding and displaying the frames myself. I can't rely on the default Android codecs.

olivierg
  • Any news about this? I need to play back a video within an OpenGL quad - i.e. to get the video into an OpenGL texture, frame by frame. Do I have to go that FFmpeg-way or is there a simpler solution? – j00hi Jul 03 '11 at 16:55
  • @j00hi: this is off-topic, the question is not "how to play a video with OpenGL". Please search/ask another question. – olivierg Jul 05 '11 at 09:59
  • It's been nearly 10 years since the last activity. Is there a way to reopen this question? – Bato-Bair Tsyrenov Feb 21 '20 at 18:40

3 Answers


I'm going to attempt to elaborate on and consolidate the answers here based on my own experiences.

Why OpenGL

When people think of rendering video with OpenGL, most are attempting to exploit the GPU to do color space conversion and alpha blending.

For instance, converting YV12 video frames to RGB. Color space conversions like YV12 -> RGB require that you calculate the value of each pixel individually. A 1280 x 720 frame is 921,600 pixels; at 30 fps, that is roughly 27.6 million per-pixel conversions every second.

What I've just described is really what SIMD was made for - performing the same operation on multiple pieces of data in parallel. The GPU is a natural fit for color space conversion.
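
To make that cost concrete, here is a minimal sketch of the naive scalar loop, using fixed-point BT.601 coefficients (the function name and exact coefficients are illustrative, not taken from any particular decoder):

```c
#include <stdint.h>

static uint8_t clamp8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v); }

/* Convert one YV12 frame to packed RGB888. YV12 layout: full-resolution
 * Y plane, then quarter-resolution V plane, then quarter-resolution U plane. */
void yv12_to_rgb888(const uint8_t *src, uint8_t *dst, int w, int h)
{
    const uint8_t *y_plane = src;
    const uint8_t *v_plane = src + w * h;
    const uint8_t *u_plane = v_plane + (w / 2) * (h / 2);

    for (int j = 0; j < h; j++) {
        for (int i = 0; i < w; i++) {
            int y = y_plane[j * w + i];
            int u = u_plane[(j / 2) * (w / 2) + i / 2] - 128;
            int v = v_plane[(j / 2) * (w / 2) + i / 2] - 128;

            /* Fixed-point BT.601, scaled by 256. */
            *dst++ = clamp8(y + ((359 * v) >> 8));           /* R */
            *dst++ = clamp8(y - ((88 * u + 183 * v) >> 8));  /* G */
            *dst++ = clamp8(y + ((454 * u) >> 8));           /* B */
        }
    }
}
```

Every output pixel runs the same small arithmetic kernel on independent data, which is exactly the shape of work SIMD units and GPU shaders are built for.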

Why !OpenGL

The downside is the process by which you get texture data into the GPU. For each frame you have to load the texture data into memory (a CPU operation) and then copy this texture data into the GPU (another CPU operation). It is this load/copy that can make using OpenGL slower than the alternatives.
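
In GLES2 terms, the per-frame upload looks something like the sketch below (texture creation and GL context setup omitted; the function name is mine):

```c
#include <GLES2/gl2.h>
#include <stdint.h>

/* 'tex' is a texture allocated once at startup with glTexImage2D;
 * 'rgb' is the already color-converted frame. */
void upload_frame(GLuint tex, const uint8_t *rgb, int w, int h)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    /* glTexSubImage2D reuses the existing texture storage, but the pixels
     * still make a synchronous CPU-side trip into the driver every frame. */
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                    GL_RGB, GL_UNSIGNED_BYTE, rgb);
}
```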

If you are playing low-resolution videos, you may well not see the speed difference, because your CPU won't be the bottleneck. However, if you try HD you will more than likely hit this bottleneck and notice a significant performance hit.

The way this bottleneck has traditionally been worked around is by using Pixel Buffer Objects (allocating GPU memory to store texture loads). Unfortunately GLES2 does not have Pixel Buffer Objects; they were only added in OpenGL ES 3.0.
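
For reference, this is roughly what the PBO streaming pattern looks like where it is available (desktop GL or OpenGL ES 3.0; the names below are illustrative):

```c
#include <GLES3/gl3.h>
#include <stdint.h>
#include <string.h>

void upload_frame_pbo(GLuint pbo, GLuint tex, const uint8_t *rgb,
                      size_t size, int w, int h)
{
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);
    /* Orphan the previous storage so the driver need not stall on it. */
    glBufferData(GL_PIXEL_UNPACK_BUFFER, size, NULL, GL_STREAM_DRAW);
    void *dst = glMapBufferRange(GL_PIXEL_UNPACK_BUFFER, 0, size,
                                 GL_MAP_WRITE_BIT | GL_MAP_INVALIDATE_BUFFER_BIT);
    if (dst) {
        memcpy(dst, rgb, size);  /* CPU writes into driver-owned memory */
        glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);
    }
    glBindTexture(GL_TEXTURE_2D, tex);
    /* With a PBO bound, the last argument is an offset into the buffer,
     * so the transfer can proceed asynchronously (e.g. via DMA). */
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                    GL_RGB, GL_UNSIGNED_BYTE, (const void *)0);
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
}
```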

Other Options

For the above reasons, many have chosen to use software decoding combined with available CPU extensions like NEON for color space conversion. An implementation of YUV-to-RGB for NEON exists here. The means by which you draw the frames, SDL vs. OpenGL, should not matter for RGB, since you are copying the same number of pixels in both cases.
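
As a simplified illustration of the idea (not the linked implementation), here is a NEON sketch computing just the R channel for 8 pixels at a time, assuming the chroma plane has already been upsampled to full resolution:

```c
#include <arm_neon.h>
#include <stdint.h>

/* R = clamp(Y + 1.402 * (V - 128)), with 1.402 approximated as 90/64. */
static void neon_r_channel(const uint8_t *y, const uint8_t *v, uint8_t *r)
{
    int16x8_t y16 = vreinterpretq_s16_u16(vmovl_u8(vld1_u8(y))); /* widen 8 luma bytes */
    int16x8_t v16 = vreinterpretq_s16_u16(vmovl_u8(vld1_u8(v))); /* widen 8 chroma bytes */
    v16 = vsubq_s16(v16, vdupq_n_s16(128));               /* center chroma on zero */
    int16x8_t rv = vshrq_n_s16(vmulq_n_s16(v16, 90), 6);  /* multiply by 1.402 in Q6 */
    vst1_u8(r, vqmovun_s16(vaddq_s16(y16, rv)));          /* saturate to [0,255], store */
}
```

A full kernel also computes G and B, handles the 2x2 chroma subsampling, and interleaves the output, but the structure is the same: one instruction operates on 8 pixels instead of 1.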

You can determine whether your target device supports NEON by running cat /proc/cpuinfo from an adb shell and looking for neon in the Features line.
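
The same check can be made at runtime with the NDK's cpufeatures helper library (link the cpufeatures module in your build; the wrapper function below is mine):

```c
#include <cpu-features.h>

/* Returns nonzero if we are on an ARM CPU that advertises NEON. */
int has_neon(void)
{
    return android_getCpuFamily() == ANDROID_CPU_FAMILY_ARM &&
           (android_getCpuFeatures() & ANDROID_CPU_ARM_FEATURE_NEON) != 0;
}
```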

Error 454

I have gone down the FFmpeg/OpenGLES path before, and it's not very fun.

You might try porting ffplay.c from the FFmpeg project, which has been done before using an Android port of SDL. That way you aren't building your decoder from scratch, and you won't have to deal with the idiosyncrasies of AudioTrack, which is an audio API unique to Android.

In any case, it's a good idea to do as little NDK development as possible and rely on porting, since the ndk-gdb debugging experience is pretty lousy right now in my opinion.

That being said, I think OpenGL ES performance is the least of your worries. I found the performance to be fine, although I admit I only tested on a few devices. The decoding itself is fairly intensive, and I wasn't able to do very aggressive buffering (from the SD card) while playing the video.

Matthew
  • Thanks for your answer. I'm actually comfortable with AudioTrack and the NDK. I have a popular app which uses these extensively, with a lot of native code and audio optimizations. Building and using the FFmpeg libs is OK too, I'm not afraid of that. My question really is about video frame displaying. Is OpenGL the method that achieves the best FPS, given all the texture stuff? What about the RGB565 default OpenGL format? Do videos look good with that pixel format? – olivierg Apr 14 '11 at 16:55
  • Kudos to you, sir! AudioTrack gave me fits in combination with the NDK. I used RGB565 and I didn't notice a difference in the video quality. As before, for my use it was the decoding itself that tended to lower the framerate, which never dropped below ~24 fps. – Matthew Apr 14 '11 at 16:58
  • Okay, yes I've actually already rendered very large JPEG images as textures in OpenGL RGB565, and it looked ok, but I was under the impression that using RGB888 would be better (see [my other question](http://stackoverflow.com/questions/5666287/is-it-possible-to-render-in-rgb888-with-opengl)), hence the reason for thinking about jnigraphics. I'm not sure yet, but I'm familiar with OpenGL so I think I'll go that route unless someone comes out with a better idea. – olivierg Apr 14 '11 at 17:04
  • Oh, and by the way, I have rephrased the question, to focus on the frame rendering aspect. The decoder aspect is marginal. Sorry for confusion. – olivierg Apr 14 '11 at 17:09

Actually I have deployed a custom video player system, and almost all of my work was done on the NDK side. We are getting full-frame 720p video and above, including our custom DRM system. OpenGL is not your answer: since pixel buffer objects are not supported on Android, you are basically re-uploading your textures every frame, and that defeats OpenGL ES's texture caching. You frankly need to shove the video frames through the natively supported Bitmap on Froyo and above; before Froyo, you're hosed. I also wrote a lot of NEON intrinsics for color conversion, rescaling, etc. to increase throughput. I can push 50-60 fps through this model on HD video.
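
Assuming "Native supported Bitmap" refers to the jnigraphics API (android/bitmap.h, available since Froyo), the core lock/copy/unlock pattern is roughly the sketch below; the function name is mine, and the decoded frame must already match the Bitmap's format and stride:

```c
#include <jni.h>
#include <android/bitmap.h>
#include <stdint.h>
#include <string.h>

/* Copy one decoded frame into a Java Bitmap's pixel buffer. The Java side
 * then invalidates the View holding the Bitmap to trigger a redraw.
 * Both AndroidBitmap_* calls return 0 on success. */
void blit_frame(JNIEnv *env, jobject bitmap, const uint8_t *frame, size_t frame_size)
{
    AndroidBitmapInfo info;
    void *pixels;

    if (AndroidBitmap_getInfo(env, bitmap, &info) < 0)
        return;
    if (AndroidBitmap_lockPixels(env, bitmap, &pixels) < 0)
        return;

    memcpy(pixels, frame, frame_size);  /* e.g. RGB565 pixels from the converter */
    AndroidBitmap_unlockPixels(env, bitmap);
}
```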

Greg
  • Thank you, Greg, for this information. By "Native supported Bitmap" I assume you are talking about the jnigraphics library exposed by the NDK. Do you need to refresh/redraw the Bitmap/View in Java after changing pixels in native code? Have you tested this on many devices? Is the framerate consistent across devices? – olivierg Jun 28 '11 at 11:59
  • Greg, for someone who would like to stream live video from the phone, can you recommend a book, open source project, or tutorial? Sorry for asking a new question in a comment, but I wanted to get your attention. Thanks. – nickfox Jul 06 '11 at 20:53
  • @Greg: could you please add some more details? I have opened a bounty for this question because there are only two answers and they are contradictory about OpenGL. Can you please be a little more specific? For instance, have you tried using glTexSubImage2D() instead of glTexImage2D() to prevent the OpenGL caching issue that you mention? – olivierg Jul 06 '11 at 22:50
  • @Greg: you said you can play 720p video using this technique? Were you using a software video decoder? I have done that approach for the iPhone, and the sw solution (ffmpeg) stopped working reliably at 320p video resolution (maybe 480p on the iPhone 2). On the iPhone I switched to AV Foundation. I am looking to do the same for Android, but they don't seem to have a nice video editing framework (yet). – KPK Aug 04 '11 at 23:47