25

iOS on the iPad has video streaming support for e.g. H.264 via MPMoviePlayerController and the like, but I receive H.264 data through a custom, proprietary stream and need to decode it in a soft real-time scenario.
Can the iPad's/iOS's video decoder be accessed in any way to decode this data?

Update: Apparently the iOS 4.0 Core Media framework supports decoding frames and knows about H.264, but there is no sample code, nor can I see what I am actually supposed to call for the actual decoding.


Update (ten years later!)

For anyone googling here: these days you do this on iOS with VideoToolbox.


Fattie
Georg Fritzsche
  • 1
    AVAssets don't support streaming yet. It seems that the low level Core Media Framework is the place to look. The H.264 codec is defined there, which is a good sign, and there are block buffering structures where you don't have to store the entire asset in memory or on disk. I'd like to see code for any part of this, especially for the part where a sample buffer is coordinated with a layer or captured. – Peter DeWeese Sep 01 '10 at 17:05
  • @Peter: Thanks, that looks more promising. It seems it's supposed to support decoding to `CVImageBuffer`, but I don't really see which functions do the actual decoding. – Georg Fritzsche Sep 03 '10 at 17:50
  • Didn't the MoviePlayer demo help you? http://developer.apple.com/iphone/library/samplecode/MoviePlayer_iPhone/Introduction/Intro.html – karlphillip Sep 08 '10 at 16:10
  • @karl: That just passes a URL to a high-level class, which means it has to be a *specific kind of stream*. I have a *custom, non-standard stream* over which I receive movie samples and need to decode those samples. – Georg Fritzsche Sep 08 '10 at 16:13
  • Did you find a suitable solution to this question? I am also looking to achieve something similar. – Sander Mar 06 '12 at 11:32
  • @Sander: For the moment - until Apple possibly opens the APIs - you have to roll your own solution. – Georg Fritzsche Mar 06 '12 at 11:56
  • How did you see the Core Media Framework supports decoding frames? I read the documentation you linked but could only find reference for storage of compressed frames, not the actual decoding itself. – Cthutu Oct 10 '12 at 19:54

5 Answers

12

After raising the issue with Apple DTS, it turns out that there is currently no way to decode video data from custom stream sources.

I will file an enhancement request for this.

Georg Fritzsche
  • Is the enhancement request open to the public? I'd like to vote for it :) – Gili Sep 14 '12 at 21:19
  • @Gili: No, but you could just open a bug / enhancement request on this too. – Georg Fritzsche Nov 02 '12 at 09:18
  • Hi, Georg. I know that quite a bit of time has passed since then, but still there is no good answer for this question. Could you please share if there is any way of using custom stream sources? Thanks a million! – Lukasz Czerwinski Jun 17 '15 at 02:44
6

If you continue to have problems with it, I suggest you take a look at libavcodec for decoding the data (available from the FFmpeg project).

There are great ffmpeg tutorials at dranger that show how to properly decode (through libavcodec) and display video data (using libsdl), among other things.

karlphillip
  • I wasn't looking for alternatives at this point, but thanks anyway. – Georg Fritzsche Sep 10 '10 at 06:51
  • Interesting. However, libavcodec is LGPL, so as I understand it, one cannot use it for commercial iOS apps (since it needs to be statically linked). Or do I misunderstand something? – Sander Mar 06 '12 at 11:31
  • 1
    This [fine thread](https://news.ycombinator.com/item?id=3341852) discusses some of the issues. – karlphillip Mar 06 '12 at 11:42
  • 1
    There are some real legal issues with linking ffmpeg libs into your iOS app, just calling it a "Framework" does not change anything. See http://multinc.com/2009/08/24/compatibility-between-the-iphone-app-store-and-the-lgpl/ for more info. ffmpeg works great on the server or on the desktop, but it is not a solution for linking into an iOS app. – MoDJ Aug 11 '13 at 18:58
  • These days (2019), compiling ffmpeg in to iOS does work (with some technical problems reading some streams). But yes the legal problems remain. – Fattie Feb 01 '19 at 12:47
4

2019

There are two solutions

  1. Do it "by hand", which means using AVFoundation and in particular VideoToolbox.

To get going with that you basically start with https://developer.apple.com/videos/play/wwdc2014/513/ Enjoy!

I have to say, that is really the "correct and better" solution.

  2. If you can get the ffmpeg API working inside your iOS app, you can use FFmpeg; it will do hardware decoding after some fiddling.

There are a number of ways to get started with that. (One absolutely amazing new thing is the Swift FFmpeg wrapper by sunlubo: https://github.com/sunlubo/SwiftFFmpeg )

Be aware that with the "ffmpeg" approach there are, in short, a number of legal/license issues with ffmpeg on iOS. One can search and read about those problems.

However, on the technical side, it is these days indeed possible to compile ffmpeg right into iOS and use it raw in your iOS code. (Using it as a plain C library may be easiest.)

We just did an enormous project doing just this, as well as other approaches. (I never want to see FFmpeg again!)

You can in fact achieve actual hardware decoding, in iOS, using FFmpeg.

We found it to be incredibly fiddly. And a couple of bugs need to be patched in FFmpeg. (I hope I never see videotoolbox.c again :/ )

So once again your two options for hardware decoding in iOS are

  1. Do it "by hand" AVFoundation/VideoToolbox.

  2. Use FFmpeg.

Item 2 is incredibly fiddly and takes a lot of time. Item 1 takes a huge amount of time. Tough choice :/
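One concrete piece of the "by hand" VideoToolbox route: the decoder expects AVCC-style samples (each NAL unit prefixed with a 4-byte big-endian length) rather than the Annex-B start codes most raw H.264 streams carry, so you typically rewrite the start codes into length fields before wrapping the data in a `CMBlockBuffer`. A minimal sketch of that conversion, with the function name mine and assuming the stream uses 4-byte (`00 00 00 01`) start codes throughout:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Rewrite each 4-byte Annex-B start code (00 00 00 01) into a 4-byte
 * big-endian NAL length prefix, in place. Returns 0 on success, -1 if
 * the buffer does not begin each NAL unit with a 4-byte start code. */
static int annexb_to_avcc(uint8_t *buf, size_t len) {
    size_t pos = 0;
    while (pos + 4 <= len) {
        if (memcmp(buf + pos, "\x00\x00\x00\x01", 4) != 0)
            return -1;                      /* not aligned on a start code */
        size_t next = pos + 4;              /* scan for the next start code */
        while (next + 4 <= len && memcmp(buf + next, "\x00\x00\x00\x01", 4) != 0)
            next++;
        if (next + 4 > len)
            next = len;                     /* last NAL unit runs to the end */
        uint32_t nal_len = (uint32_t)(next - pos - 4);
        buf[pos]     = (uint8_t)(nal_len >> 24);
        buf[pos + 1] = (uint8_t)(nal_len >> 16);
        buf[pos + 2] = (uint8_t)(nal_len >> 8);
        buf[pos + 3] = (uint8_t)(nal_len);
        pos = next;
    }
    return 0;
}
```

The converted buffer can then be handed to `CMBlockBufferCreateWithMemoryBlock` and `CMSampleBufferCreate` and pushed through `VTDecompressionSessionDecodeFrame`. Note that real streams may also use 3-byte start codes, which this sketch deliberately does not handle.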

Fattie
  • This suggestion to use ffmpeg fails on a couple of fronts. First, the license. Most companies do not want LGPL or GPL code linked into the app. Second, the runtime performance. The ffmpeg code is C code, it is way way slower than the hardware built into iOS devices. – MoDJ May 16 '19 at 04:19
  • Hi @MoDJ - regarding licensing, um, did you read the paragraph about licenses? :) – Fattie May 16 '19 at 12:17
  • hi @MoDJ - regarding performance. Of course you use hardware decoding (i.e., in FFmpeg). On iOS we found it incredibly fiddly to ensure that FFmpeg does use hardware decoding. – Fattie May 16 '19 at 12:23
2

With iOS 8, you can use VideoToolbox (https://developer.apple.com/reference/videotoolbox) to decode H.264 to raw frames. The VT APIs are hardware accelerated and will give you much better performance compared with libavcodec. If you want to play the frames or generate a preview, you can use an EAGL-based renderer. I have written a sample app that encodes frames from raw to H.264 (https://github.com/manishganvir/iOS-h264Hw-Toolbox); H.264 to raw shouldn't be that difficult!
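One prerequisite worth spelling out for the decode direction: before VideoToolbox can decode anything, you need the stream's SPS and PPS NAL units to build a `CMVideoFormatDescription` (via `CMVideoFormatDescriptionCreateFromH264ParameterSets`). The NAL unit type is the low 5 bits of the byte following each Annex-B start code. A small pure-C sketch of scanning a raw buffer for those parameter sets; the helper names are mine, not from the sample app:

```c
#include <stdint.h>
#include <stddef.h>

/* H.264 NAL unit types relevant to setting up a decoder. */
enum { NAL_IDR = 5, NAL_SPS = 7, NAL_PPS = 8 };

/* Return the offset of the first NAL payload byte at or after `from`,
 * skipping a 3- or 4-byte Annex-B start code, or (size_t)-1 if none. */
static size_t next_nal(const uint8_t *buf, size_t len, size_t from) {
    for (size_t i = from; i + 3 < len; i++) {
        if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1)
            return i + 3;
    }
    return (size_t)-1;
}

/* Scan a raw Annex-B buffer and report whether it carries both an SPS
 * and a PPS NAL unit -- the parameter sets needed to construct a
 * CMVideoFormatDescription before any frame can be decoded. */
static int has_decoder_config(const uint8_t *buf, size_t len) {
    int saw_sps = 0, saw_pps = 0;
    size_t pos = 0;
    while ((pos = next_nal(buf, len, pos)) != (size_t)-1) {
        int type = buf[pos] & 0x1F;   /* NAL type: low 5 bits of header byte */
        if (type == NAL_SPS) saw_sps = 1;
        if (type == NAL_PPS) saw_pps = 1;
        pos++;                        /* keep scanning past this header */
    }
    return saw_sps && saw_pps;
}
```

In a custom-stream scenario you would buffer incoming data until this returns true, extract the SPS/PPS payloads, create the format description, and only then open a `VTDecompressionSession`.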

manishg
  • One thing, you CAN these days, if you struggle, get ffmpeg to do hardware decoding, on iOS. – Fattie Feb 01 '19 at 12:52
  • that is a fantastic example app. I believe it is actually ***the one and only example, on Earth, of someone actually getting VT working on iOS***. Amazing! – Fattie Feb 01 '19 at 12:53
  • 1
    Many VideoToolbox example apps exist, here is one for seamless looping HD content. https://github.com/mdejong/H264SeamlessLooping – MoDJ May 16 '19 at 04:23
1

Have you tried writing the H.264 stream that you receive from your protocol to a temporary file which you continually append to, and then, once you have written enough bytes to avoid buffering during playback, passing the URL of that temp file to MPMoviePlayerController?
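The write-and-append half of this suggestion could look like the following hedged sketch: append each received chunk to a growing file and report the running total, so the caller knows when enough is buffered to hand the file's URL to the player. The function name, path, and threshold are all made up for illustration:

```c
#include <stdio.h>
#include <stddef.h>

/* Hypothetical buffering threshold: start playback after 256 KiB. */
#define PLAYBACK_THRESHOLD (256 * 1024)

/* Append one received chunk to the temp file at `path` and return the
 * total number of bytes written to it so far, or -1 on error. */
static long append_chunk(const char *path, const void *data, size_t len) {
    FILE *f = fopen(path, "ab");          /* append mode; creates the file */
    if (!f) return -1;
    if (fwrite(data, 1, len, f) != len) { fclose(f); return -1; }
    long written = ftell(f);              /* position == total file size */
    fclose(f);
    return written;
}
```

The caller would compare the return value against `PLAYBACK_THRESHOLD` and, once past it, pass the file's URL to MPMoviePlayerController. Note the caveat implicit in the answer: this only works if the bytes on disk form a container/stream type the player itself understands, which is exactly what a custom, non-standard stream may not satisfy.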

Jason Jenkins