I'm trying to send and show a webcam stream from a Linux server to an iPhone app. I don't know if it's the best solution, but I downloaded and installed FFmpeg on the Linux server (following, for those who want to know, this tutorial). FFmpeg is working fine. After a lot of wandering, I managed to send a stream to the client by launching
```
ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 -f mpegts -vcodec libx264 udp://192.168.1.34:1234
```
where 192.168.1.34 is the address of the client. The client is currently a Mac, but it is eventually supposed to be an iPhone. I know the stream is sent and received correctly (I tested it in different ways).
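For instance, since the Mac is the machine at 192.168.1.34, one of the checks was simply pointing a UDP-capable player at the same port on that machine, something like:

```
ffplay udp://192.168.1.34:1234
```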
However, I haven't managed to watch the stream directly on the iPhone.
I have thought of several possible solutions:
First solution: store the incoming data in an `NSMutableData` object. Then, when the stream ends, save it to disk and play it with an `MPMoviePlayerController`. Here's the code:

```
#import <MediaPlayer/MediaPlayer.h>

// `video` is the NSMutableData filled from the UDP socket.
// Write to a path the app can actually write to (the bundle and the
// current working directory are read-only on the device).
NSString *path = [NSTemporaryDirectory()
                     stringByAppendingPathComponent:@"videoStream.m4v"];
[video writeToFile:path atomically:YES];

NSURL *url = [NSURL fileURLWithPath:path];
// Keep a reference to the controller (e.g. in an ivar) so it isn't
// released while the movie is playing.
MPMoviePlayerController *videoController =
    [[MPMoviePlayerController alloc] initWithContentURL:url];
[videoController.view setFrame:CGRectMake(100, 100, 150, 150)];
[self.view addSubview:videoController.view];
[videoController play];
```
The problem with this solution is that nothing is played (I only see a black square), even though the video is saved correctly (I can play it from disk with VLC). I suspect the container may be the issue: what I'm saving is a raw MPEG transport stream, and giving it an .m4v extension probably doesn't make it something `MPMoviePlayerController` can read, while VLC plays MPEG-TS happily. Besides, it's not such a great idea anyway; it was just to get something working.
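For completeness, this is roughly how I'm filling `video` from the socket (a plain BSD-socket sketch; the port number and the threading around it are placeholder details, not my exact code):

```
#import <sys/socket.h>
#import <netinet/in.h>
#import <string.h>
#import <unistd.h>

// Runs on a background thread; appends every received UDP datagram
// (raw MPEG-TS packets) to the NSMutableData that is played later.
- (void)receiveStreamInto:(NSMutableData *)video port:(uint16_t)port
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(port);
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    bind(sock, (struct sockaddr *)&addr, sizeof(addr));

    uint8_t packet[65536];
    for (;;) {
        ssize_t n = recv(sock, packet, sizeof(packet), 0);
        if (n <= 0) break;   // socket closed or error ends the stream
        [video appendBytes:packet length:(NSUInteger)n];
    }
    close(sock);
}
```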
Second solution: use `CMSampleBufferRef` to store the incoming video. This solution comes with even more problems: first of all, there's no `CoreMedia.framework` on my system. Besides, I don't fully understand what this type represents or what I'd have to do to make it work: I mean, if I start (somehow) filling this "SampleBuffer" with the bytes I receive over the UDP connection, will it then automatically call the `CMSampleBufferMakeDataReadyCallback` function I set during creation? If yes, when? When a single frame is complete, or when the whole stream has been received?
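To be explicit about my (possibly wrong) mental model, here is a sketch based only on the CoreMedia headers; `wrapReceivedBytes` is a hypothetical helper of mine, and building the `CMFormatDescriptionRef` and the sample timing info (which a real implementation would need) is left out entirely:

```
#import <CoreMedia/CoreMedia.h>

// Hypothetical: wrap the bytes of one received sample in a CMBlockBuffer,
// then in a CMSampleBuffer.
CMSampleBufferRef wrapReceivedBytes(uint8_t *bytes, size_t length,
                                    CMFormatDescriptionRef formatDesc)
{
    CMBlockBufferRef blockBuffer = NULL;
    // kCFAllocatorNull as the block allocator: the buffer won't try to
    // free `bytes` itself.
    CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault,
                                       bytes, length,
                                       kCFAllocatorNull,
                                       NULL, 0, length, 0,
                                       &blockBuffer);

    CMSampleBufferRef sampleBuffer = NULL;
    // With dataReady = true no CMSampleBufferMakeDataReadyCallback is
    // needed. As far as I understand from the headers, the callback is
    // only invoked when a buffer created with dataReady = false is asked
    // to make its data ready (CMSampleBufferMakeDataReady), not
    // automatically as bytes arrive.
    CMSampleBufferCreate(kCFAllocatorDefault,
                         blockBuffer,
                         true,     // dataReady
                         NULL,     // makeDataReadyCallback
                         NULL,     // makeDataReadyRefcon
                         formatDesc,
                         1,        // numSamples
                         0, NULL,  // no sample timing info in this sketch
                         0, NULL,  // no sample size array
                         &sampleBuffer);
    CFRelease(blockBuffer);
    return sampleBuffer;
}
```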
Third solution: use the `AVFoundation` framework (this isn't available on my Mac either). I could not figure out whether it's actually possible to start recording from a remote source, or even from an `NSMutableData`, a `char *` buffer, or something like that. In the AVFoundation Programming Guide I didn't find any reference saying whether it's possible or not.
I don't know which of these solutions is best for my purpose. ANY suggestion would be appreciated.
Besides, there's also another problem: I didn't use any segmenter program to send the video. If I understand correctly, a segmenter splits the source video into smaller/shorter files that are easier to send. If that's right, then maybe it's not strictly necessary to make things work (it could be added later). However, since the server runs Linux, I cannot use Apple's mediastreamsegmenter. Can someone suggest an open-source segmenter to use together with FFmpeg?
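For what it's worth, the direction I was considering on the FFmpeg side looks like this, assuming a build recent enough to include the segment muxer (the web-server paths are just placeholders):

```
ffmpeg -f video4linux2 -s 320x240 -i /dev/video0 \
       -vcodec libx264 \
       -f segment -segment_time 10 -segment_format mpegts \
       -segment_list /var/www/stream/playlist.m3u8 \
       /var/www/stream/segment%03d.ts
```

The idea is that the .ts segments and the .m3u8 playlist could then be served over plain HTTP, which (if I understand HTTP Live Streaming correctly) is what the iPhone expects.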
UPDATE: I edited my question, adding more information on what I have done so far and what my doubts are.