I am a beginning Swift programmer with no Objective-C experience. I would like to create a UIView subclass which displays a video stream from an IP camera (RTSP, ONVIF).

It looks like I will need to use the ffmpeg library http://sourceforge.net/projects/ffmpeg-ios/

I did find one ffmpeg tutorial in Japanese ;-) http://qiita.com/tottokotkd/items/d9d376d5993961627aec

Does anyone know of a tutorial on streaming an IP camera using Swift?

– European

1 Answer


I already answered a similar question here, but since Swift is involved I'll try to provide an extended answer as far as I can.

First, I guess by referring to the Japanese tutorial you meant this. I have no experience with Swift/C bridging, but if the tutorial actually works it seems that having a header file with the C imports (in this case Tutorials-Bridging-Header.h, which includes the actual ffmpeg headers) is enough. After that, at least according to the tutorial, you can use the ffmpeg data types and functions in your code (at least that is what happens in Tutorial1.swift - it directly calls avcodec_decode_video2 and others).
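
If that holds, the setup is roughly this (a minimal sketch - the header name follows the tutorial, and which libav* headers you include depends on your ffmpeg build):

```swift
// Tutorials-Bridging-Header.h (set as the "Objective-C Bridging Header"
// in Build Settings) contains the plain C includes, e.g.:
//
//   #include <libavformat/avformat.h>
//   #include <libavcodec/avcodec.h>
//   #include <libswscale/swscale.h>

// After that the C symbols are callable from Swift directly:
av_register_all()        // register all demuxers/muxers and codecs
avformat_network_init()  // needed for network protocols such as RTSP
```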

If the Swift interop is as easy as it seems then:

1) You need an iOS build of ffmpeg. Either use a SourceForge/GitHub project that ships an Xcode project (though if you only need RTSP and certain codecs you may still have to tweak it for your needs, since depending on licensing factors you may need to disable some encoders - H.264 in particular), or take the ffmpeg sources and build them yourself with the iOS toolchain - it is actually not that hard (already mentioned in my previous post).

2) Then you need to link against ffmpeg and initialize it (all the av_register_all stuff you see in the tutorials - see the sketch above) and feed it the stream:

2a) For RTSP, if you know the RTSP URL of the stream, Googling for avformat_open_input (avio_open in older examples) is a good start: you can feed the URL straight to it, ffmpeg's transports and demuxers should take care of the actual connection, and then you can extract the data from the streams using av_read_frame, somewhat similar to this.
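
Something like this (a rough sketch - the camera URL is a placeholder and error handling is reduced to a fatalError):

```swift
// Placeholder URL - replace with your camera's RTSP endpoint.
let url = "rtsp://192.168.1.10/stream1"

var formatContext: UnsafeMutablePointer<AVFormatContext>? = nil
guard avformat_open_input(&formatContext, url, nil, nil) >= 0,
      avformat_find_stream_info(formatContext, nil) >= 0 else {
    fatalError("could not open the RTSP stream")
}

var packet = AVPacket()
while av_read_frame(formatContext, &packet) >= 0 {
    // packet.stream_index tells you which stream the packet belongs to;
    // hand video packets to the decoder (step 3 below).
    av_packet_unref(&packet)  // av_free_packet() on older ffmpeg versions
}
avformat_close_input(&formatContext)
```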

2b) For ONVIF you will need to actually implement the XML (SOAP) requests that retrieve the stream URI - GetStreamUri in the ONVIF media service. If the returned URI is RTSP, it then plays as a regular RTSP stream; if it is an HTTP stream with a standard content type, avformat_open_input should be able to handle it as well.
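
The GetStreamUri envelope looks roughly like this - a hypothetical sketch, where the service path /onvif/media_service and the profile token Profile_1 are camera-specific placeholders, and WS-Security authentication is omitted:

```swift
import Foundation

let mediaServiceURL = URL(string: "http://192.168.1.10/onvif/media_service")!
let soapBody = """
<s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope">
  <s:Body>
    <GetStreamUri xmlns="http://www.onvif.org/ver10/media/wsdl">
      <StreamSetup>
        <Stream xmlns="http://www.onvif.org/ver10/schema">RTP-Unicast</Stream>
        <Transport xmlns="http://www.onvif.org/ver10/schema">
          <Protocol>RTSP</Protocol>
        </Transport>
      </StreamSetup>
      <ProfileToken>Profile_1</ProfileToken>
    </GetStreamUri>
  </s:Body>
</s:Envelope>
"""

var request = URLRequest(url: mediaServiceURL)
request.httpMethod = "POST"
request.setValue("application/soap+xml; charset=utf-8", forHTTPHeaderField: "Content-Type")
request.httpBody = soapBody.data(using: .utf8)

URLSession.shared.dataTask(with: request) { data, _, _ in
    // The response body contains a <Uri>rtsp://...</Uri> element - parse it
    // (e.g. with XMLParser) and feed that URL to avformat_open_input.
}.resume()
```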

3) Find the needed decoder in ffmpeg, decode the data obtained from av_read_frame, and present it in your view.
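
A decoding sketch to go with the read loop above - this uses avcodec_decode_video2 (the call the Japanese tutorial also makes; newer ffmpeg versions replace it with avcodec_send_packet/avcodec_receive_frame), and videoIndex is a placeholder for the video stream index you found by scanning the streams:

```swift
let videoIndex = 0  // placeholder - find the stream with type AVMEDIA_TYPE_VIDEO

let stream = formatContext!.pointee.streams[videoIndex]!
let codecContext = stream.pointee.codec  // deprecated field, but matches the tutorial's API level
guard let codec = avcodec_find_decoder(codecContext!.pointee.codec_id),
      avcodec_open2(codecContext, codec, nil) >= 0 else {
    fatalError("decoder not available")
}

var frame = av_frame_alloc()
var gotFrame: Int32 = 0
// Inside the av_read_frame loop, for packets where
// packet.stream_index == videoIndex:
avcodec_decode_video2(codecContext, frame, &gotFrame, &packet)
if gotFrame != 0 {
    // frame!.pointee.data / .linesize now hold the decoded picture
    // (typically YUV420P) - convert it and present it (see the comments below).
}
av_frame_free(&frame)
```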

– Rudolfs Bundulis
  • It's possible to bridge C, C++, Objective-C, Objective-C++, and Swift files using bridge headers. Check the link http://stackoverflow.com/questions/24042774/can-i-mix-swift-with-c-like-the-objective-c-mm-files – Onur Tuna Apr 19 '16 at 13:25
  • hi @Rudolfs, I know this is an old answer, but can I talk to you about something regarding this? – Lysdexia Apr 11 '19 at 02:07
  • @Lysdexia yeah, what is your question? – Rudolfs Bundulis Apr 11 '19 at 07:47
  • How do you "present" it to, for example, a UIImageView? – Lysdexia Apr 11 '19 at 22:53
  • Create a bitmap (UIImage) from the decoded pixels and present that bitmap in the view (see the sketch after these comments). But that is very inefficient - an OpenGL texture would be much better. – Rudolfs Bundulis Apr 12 '19 at 06:41
  • @RudolfsBundulis sorry, but do you have a simple tutorial on this? I currently have AVPicture data – Lysdexia Apr 15 '19 at 00:56
  • @Lysdexia did you try googling? The second hit for `UIImage AVPicture` seemed good - see this https://github.com/mooncatventures-group/ffmpegDecoder/blob/master/FFmpegDecoder/FrameExtractor.m – Rudolfs Bundulis Apr 15 '19 at 09:07
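
For the bitmap route mentioned in the comments, a hedged sketch: assume the decoded frame has already been converted to RGBA with sws_scale (conversion omitted), so rgbaData, width and height are given here:

```swift
import UIKit

// Build a UIImage from a tightly packed RGBA pixel buffer.
func makeImage(rgbaData: Data, width: Int, height: Int) -> UIImage? {
    guard let provider = CGDataProvider(data: rgbaData as CFData),
          let cgImage = CGImage(width: width,
                                height: height,
                                bitsPerComponent: 8,
                                bitsPerPixel: 32,
                                bytesPerRow: width * 4,
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue),
                                provider: provider,
                                decode: nil,
                                shouldInterpolate: false,
                                intent: .defaultIntent) else { return nil }
    return UIImage(cgImage: cgImage)  // assign to your UIImageView.image on the main thread
}
```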