
I can listen for and receive one RTSP stream with the FFmpeg library using this code:

AVFormatContext* format_context = NULL;
const char* url = "rtsp://example.com/in/1";
AVDictionary *options = NULL;
av_dict_set(&options, "rtsp_flags", "listen", 0);
av_dict_set(&options, "rtsp_transport", "tcp", 0);

int status = avformat_open_input(&format_context, url, NULL, &options);
av_dict_free(&options);
if( status >= 0 )
{
    status = avformat_find_stream_info( format_context, NULL );
    if( status >= 0 )
    {
        AVPacket av_packet;
        av_init_packet(&av_packet);

        for(;;)
        {
            status = av_read_frame( format_context, &av_packet );
            if( status < 0 )
            {
                break;
            }
            av_packet_unref( &av_packet );  /* release packet data after processing */
        }
    }
    avformat_close_input(&format_context);
}

But if I try to open another similar listener (in another thread, with another URL) at the same time, I get an error:

Unable to open RTSP for listening rtsp://example.com/in/2: Address already in use

It looks like avformat_open_input tries to open a socket which is already opened by the previous call of avformat_open_input. Is there any way to share this socket between the 2 threads? Maybe there is some dispatcher in FFmpeg for such a task.

Important Note: In my case my application must act as a listening server for incoming RTSP connections! It is not a client connecting to another RTSP server.

Alexander Ushakov
  • If I understand correctly, the main RTSP socket is used only for command messages, and at the underlying level all RTSP data is sent in independent RTP streams that don't interfere. So it doesn't look difficult to process these streams separately. – Alexander Ushakov May 15 '17 at 12:34
  • It doesn't look like FFmpeg allows you to access the socket directly. However, you might be able to use the [`async` protocol handler](https://ffmpeg.org/ffmpeg-protocols.html#async) for this task. I'm not super familiar with the code, so I'm not offering this as an answer. Apologies for misunderstanding your question earlier. – dho May 19 '17 at 21:11
  • async protocol handler looks interesting. Thank you – Alexander Ushakov May 20 '17 at 04:06
  • This is confusing. Title says _"How to **listen to 2 incoming rtsp** streams at the same time with FFMpeg"_ and the description confirms the same thing, BUT then you throw in the _"**Important Note:** It is not a client connecting to another RTSP server"_ which is confusing. Is your app going to SEND OUT to RTSP connections? (_eg_: I put the URL into my VLC media player, now I'm an incoming RTSP connection, and your app will give me video bytes?) If yes, then shouldn't you be using FFserver? – VC.One May 24 '17 at 20:45
  • No, I don't want to send any RTSP streams anywhere. The note was added after the first commenters thought that the incoming streams were created by my application (were pulled). But in my case connections are pushed to my application. – Alexander Ushakov May 24 '17 at 20:56
  • FFserver is abandoned (see the news item "July 10th, 2016, ffserver program being dropped" at http://ffmpeg.org/index.html#news) because of its poor internal design. Also, I need code to embed in my application, not a standalone server. – Alexander Ushakov May 24 '17 at 21:01
  • Okay I had typed something (Answer). Let me know if it's useful to you or else I will delete it... – VC.One May 24 '17 at 21:14
  • Clarification on what you mean when you say pushed to your application would be helpful. – John Jones May 24 '17 at 23:39
  • @JohnJones Pushed to my application = the connection is established by some other program. My application is just waiting for such incoming streams. Push here means Push technology, as opposed to Pull technology. – Alexander Ushakov May 25 '17 at 18:12
  • Posting your actual code would probably make helping a lot easier. It seems like you either don't understand rtsp or you're explaining what you're attempting poorly. – John Jones May 25 '17 at 18:37
  • My actual code is in the question, except for the processing of received frames. – Alexander Ushakov May 25 '17 at 21:13
  • That would be the part of this puzzle that I'd like to see – John Jones May 25 '17 at 21:27
  • Processing of received frames has no relation to the question. There is no need to go off topic. – Alexander Ushakov May 25 '17 at 21:33
  • It absolutely does, if you aren't doing the RTSP negotiation here then where is it happening? If you're doing the handshake somewhere else then what this part of the code is receiving is not even an RTSP stream. – John Jones May 25 '17 at 22:41
  • @JohnJones All RTSP negotiation is made by ffmpeg in avformat_open_input. – Alexander Ushakov May 26 '17 at 08:41
  • Exactly, so what do you mean you're receiving the streams somewhere else if you're doing the client-server interaction here? I'm just trying to figure out how your code works – John Jones May 26 '17 at 17:04
  • @JohnJones I didn't say that I'm "receiving the streams somewhere else". The only note was that the RTSP streams are started by remote software - my application only waits for and receives such streams. My code allows receiving one stream, and I'm searching for a method to tell ffmpeg to receive a second stream on this port. – Alexander Ushakov May 27 '17 at 08:58
  • Cool, so @VC.One's answer looks like the solution to me. Instead of doing the listener in code, it's much easier to just spawn threads for the new streams as they show up. Unless there's a reason you want the listeners to be in the same process? – John Jones May 30 '17 at 22:41

3 Answers


You should look into FFserver if you want your app to act as a server that listens for and sends data to multiple incoming RTSP connections.

The following attempts to provide useful resource info toward solving the title of the question.

..."How to listen to 2 incoming rtsp streams at the same time with FFMpeg"

Someone asking on the FFmpeg forums managed to receive data from two RTSP streams on the command line: http://ffmpeg.gusari.org/viewtopic.php?f=11&t=3246 (see the text after the 3 images).

What I already achieved is to receive two streams via rtsp:

Server code :

ffmpeg -loop 1 -re -i ~/Desktop/background.png -rtsp_flags listen -timeout -1 -i rtsp://localhost:5001/live.mp4 -rtsp_flags listen -timeout -1 -i rtsp://localhost:5002/live.mp4 -filter_complex \
"[1:v] setpts=PTS-STARTPTS [left]; \
[2:v] setpts=PTS-STARTPTS [right]; \
[0:v][left]overlay=0:eof_action=pass:shortest=0 [bgleft]; \
[bgleft][right]overlay=w:eof_action=pass:shortest=0" ~/Desktop/test.mp4

and I faked two stream clients with :

ffmpeg -re -i ~/Desktop/normal.mp4 -f rtsp rtsp://localhost:5001/live.mp4
ffmpeg -re -i ~/Desktop/normal.mp4 -f rtsp rtsp://localhost:5002/live.mp4

Well, it's working somehow. The server starts and waits for incoming connections. When both clients are connected, the ffmpeg server puts the streams together and outputs them to test.mp4. If one client stops, the red background appears and the video continues.

Unfortunately I only use FFmpeg on the command line, not as a C library, so I cannot provide code. But it's just a different way to access the same features.

VC.One
  • Using different ports for listening may be an option, but in that case remote clients must know which port to use, or have some sort of fallback to change ports if the connection fails. Unfortunately for me, I can't modify the client software. – Alexander Ushakov May 25 '17 at 18:17

I admittedly know nothing of the library, but it seems to me that by setting your context to NULL you're getting whatever the defaults are. You might try explicitly creating a context. A quick glance at the docs suggests avformat_alloc_context might be what you're looking for.

Here's someone else's code where they allocate the contexts; it might be confirmation bias at work, but this seems important.

Alternatively, if faced with the same challenge (and since you haven't provided end-game details, I'm taking creative liberty), an easier approach might be to use ffmpeg/libav's CLI to pull the stream and mux it on your audio device. So, on an incoming connection, spawn an ffplay with the received URL.

Edit: So, you're passing a URL into avformat_open_input. It seems an awful lot like, when you say streams are pushed to your application, what you are actually receiving is a URL to a stream. Otherwise you'd be doing some kind of shenanigans with negotiating the RTSP stream elsewhere and then restreaming it to ffmpeg? That doesn't make any sense to me; just let ffmpeg handle it.

It would help to see what you've got implemented already if you could post some of it.

John Jones
  • 1. For network sources it is common practice to pass format_context = NULL to avformat_open_input. Even if I create the AVFormatContext myself, there are no methods in it to control network IO the same way we can control file IO with the pb field. 2. I can't pull streams from anywhere because the streams are pushed to my application. See the Note at the end of the question. – Alexander Ushakov May 23 '17 at 08:45

You must check the video device info. The reason is that not every device can stream more than one; there is a restriction. But most devices can stream to multiple channels, such as main_stream, channel_1, channel_2, mjpeg_stream, etc. So you can listen to different channels at the same time. This may solve your problem. P.S.: Channel names depend on the device's manufacturer.

Berkay
  • I'm not trying to stream anything. I'm trying to receive several videos at the same time. I can successfully receive them on different interfaces (i.e. different IP addresses) but can't on one. P.S. I use a normal Linux server, not an embedded one. – Alexander Ushakov May 15 '17 at 12:55