
I am trying to live-stream some video content from an iPhone to the Internet (or a server). I have read the following post: http://stackoverflow.com/questions/4084811/iphone-http-live-streaming-without-any-server-side-processing

I understand that I can first capture the images and audio into a file and then send it out to the Internet, but I really have no idea how to start on continuously sending out these video files.

I understand I can use FFmpeg to do the streaming part. After long research I could only find a sample program called iFrameExtractor that uses the FFmpeg library, but that sample only shows how to use FFmpeg to play back a video file; there is no sample of how to use the live-streaming functionality in FFmpeg.

Can anyone provide a direction or a tutorial on how to live-stream a video file using FFmpeg? Or can anyone suggest other ways to solve this problem? I am sure lots of people want to know how to do this.

Luis Mok

3 Answers


I have done similar simple demo work; here is how it works:

iOS side
1. Use FFmpeg or another SDK to capture small video files.
2. Send them to the server, in order, each tagged with a timestamp (see the Swift sketch below).

Server side
1. Receive the files.
2. Convert each one to TS format (the MPEG-TS segments used by HTTP Live Streaming), producing a .ts file.
3. Write an HTML page that plays the stream.

How to view the stream
Access that HTML page with Safari, which supports HTTP Live Streaming natively.
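
Below is a minimal Swift sketch of the iOS side, assuming AVFoundation rather than FFmpeg for the capture step: it records fixed-length movie segments and hands each finished file to an uploader in sequence order. `SegmentRecorder` and `SegmentUploader` are placeholder names, not a real API (a matching upload sketch appears at the end of this page).

```swift
import AVFoundation

// Sketch only: record short movie segments and hand each finished file
// to an uploader in sequence order. `SegmentUploader` is a placeholder.
final class SegmentRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    private let session = AVCaptureSession()
    private let movieOutput = AVCaptureMovieFileOutput()
    private var sequence = 0

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video),
              let mic = AVCaptureDevice.default(for: .audio) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))
        session.addInput(try AVCaptureDeviceInput(device: mic))
        session.addOutput(movieOutput)
        // Cap each file at ~4 s so the server receives a steady flow of small segments.
        movieOutput.maxRecordedDuration = CMTime(seconds: 4, preferredTimescale: 600)
        session.startRunning()
        recordNextSegment()
    }

    private func recordNextSegment() {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("segment-\(sequence).mov")
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    // Fires when a segment reaches maxRecordedDuration (or recording stops).
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        let finished = sequence
        sequence += 1
        recordNextSegment() // restart immediately; a short gap between segments is unavoidable here
        SegmentUploader.shared.upload(outputFileURL, sequence: finished)
    }
}
```

On the server, each uploaded segment can then be turned into a .ts file and appended to the playlist that the HTML page points at.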

jeff wu

For streaming, you might want to look at ffserver. It is a command-line tool released as part of the FFmpeg project, and it handles the streaming side.
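
For illustration, here is a minimal ffserver configuration sketch. The port, feed and stream names, and encoding settings below are made up for the example, not recommended values:

```
# Hypothetical ffserver.conf -- names and settings are illustrative only
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 100
MaxBandwidth 10000

# Buffer that ffmpeg pushes captured media into
<Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 50M
    ACL allow 127.0.0.1
</Feed>

# Stream that clients pull from
<Stream live.flv>
    Feed feed1.ffm
    Format flv
    VideoFrameRate 25
    VideoSize 640x480
    NoAudio
</Stream>
```

You start ffserver with this config, push media into the feed with something like `ffmpeg -i input.mov http://localhost:8090/feed1.ffm`, and clients then play `http://<server>:8090/live.flv`.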

sashoalm

Not directly answering your question, but have you considered NAT/firewall issues?

Even if you are able to successfully run an HLS (or similar) server on your iPhone AND your phone is connected to the Internet, that does not necessarily mean a client can connect to it, because of NAT, firewalls, etc. This is an important consideration if your iPhone app is going to be used over 3G.

You are probably better off writing a web server to which the iPhone first uploads unprocessed video, and then transcoding and running a media server on that machine. This would save the iPhone's battery, avoid having to port ffserver to the iPhone (which is not easy), and of course allow you to deploy a robust media server.

S B
  • No, I am not going to host a server on my iPhone. Your suggested solution might solve my problem. Would you describe your idea further? For example, how would you send the individual frames to the server, transcode them into a video, and broadcast it live? – Luis Mok Jan 25 '12 at 16:40
  • 1) [Capture](http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/CameraAndPhotoLib_TopicsForIOS/Articles/TakingPicturesAndMovies.html) video on iOS. 2) As you receive video, upload it to your webserver. 3) Use FMS / Red5 to stream. If you are not sure about assembling frames into a video stream, look up webcam-grab code in OpenCV. – S B Jan 26 '12 at 05:15
  • For steps 1 and 3, I can use AVAssetWriter to write a video file (MOV, MP4) and use FMS to live-stream the received video file. But for step 2, how do I upload to the webserver? HTTP file upload? RTMP? I know the iPhone doesn't support RTMP natively; there are RTMP static libraries that could do the trick, like librtmp and rtmpdump. After months of googling I cannot find any examples of how to use these libraries in an Xcode project. I can only find tutorials on how to use them on Windows, Linux, Mac, etc., and those examples show how to do it from the command line ... – Luis Mok Jan 26 '12 at 06:58
  • Also, if I want to use RTMP to send the stream to a media server like Wowza, how can I convert the MOV file to FLV? Some people claim FFmpeg can do that, but again the iFrameExtractor Xcode example only shows how to play a recorded video; there is no example of how to remux the video file to FLV or how to establish a connection via RTMP. I would really like to see a practical example. – Luis Mok Jan 26 '12 at 07:06
  • Use HTTP PUT for the file upload in step 2. – S B Jan 26 '12 at 11:24
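
To make the HTTP PUT suggestion concrete, here is a minimal Swift sketch of an uploader. This is the `SegmentUploader` placeholder used in the capture sketch earlier on this page; the base URL and the path-per-sequence-number scheme are assumptions, not a defined API.

```swift
import Foundation

/// Sketch only: PUT each recorded segment to a URL that encodes its order,
/// e.g. https://example.com/segments/42. Endpoint and scheme are hypothetical.
final class SegmentUploader {
    static let shared = SegmentUploader(baseURL: URL(string: "https://example.com/segments")!)
    private let baseURL: URL
    init(baseURL: URL) { self.baseURL = baseURL }

    func upload(_ fileURL: URL, sequence: Int) {
        var request = URLRequest(url: baseURL.appendingPathComponent(String(sequence)))
        request.httpMethod = "PUT"
        request.setValue("video/quicktime", forHTTPHeaderField: "Content-Type")
        let task = URLSession.shared.uploadTask(with: request, fromFile: fileURL) { _, response, error in
            if let error = error {
                print("upload \(sequence) failed: \(error)") // real code would retry
            } else if let http = response as? HTTPURLResponse {
                print("upload \(sequence) -> HTTP \(http.statusCode)")
            }
        }
        task.resume()
    }
}
```

Encoding the sequence number in the path lets the server reassemble or transcode the segments in the right order even when uploads complete out of order.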