I have a situation where I have multiple cameras (`rtspsrc`), and a singleton element that does analytics on the incoming video stream. I call it a singleton element because it has request source and sink pads: only one of them should exist in the application, because it does its work on the GPU and gets better performance by processing streams in batches. Think of the application I'm building as an API to add cameras, remove cameras, turn analytics on and off per camera, etc. Cameras will have analytics run on them, with the results captured and sent onwards. The complication is that I need to share a single GStreamer element (the analytics element) across all cameras.
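For reference, the way I wire cameras into the shared element looks roughly like this (a minimal Python sketch; `myanalytics` is a made-up stand-in for my analytics element, and the H.264 decode chain and `sink_%u` request-pad naming are illustrative assumptions, not my exact code):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline.new("main")

# Stand-in for the real analytics element; assumed to expose
# request sink pads named "sink_%u".
analytics = Gst.ElementFactory.make("myanalytics", "analytics")
pipeline.add(analytics)

def add_camera(uri, index):
    """Build an rtspsrc branch and link it to a request sink pad."""
    src = Gst.ElementFactory.make("rtspsrc", f"cam{index}")
    src.set_property("location", uri)
    depay = Gst.ElementFactory.make("rtph264depay", f"depay{index}")
    dec = Gst.ElementFactory.make("avdec_h264", f"dec{index}")
    for e in (src, depay, dec):
        pipeline.add(e)
    # rtspsrc pads appear dynamically, so link on pad-added.
    src.connect("pad-added",
                lambda s, pad: pad.link(depay.get_static_pad("sink")))
    depay.link(dec)
    # One request pad per camera on the shared analytics element
    # (use get_request_pad() on GStreamer < 1.20).
    sinkpad = analytics.request_pad_simple(f"sink_{index}")
    dec.get_static_pad("src").link(sinkpad)
```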
So I have multiple cameras feeding into this single element, which then feeds out into appsinks. This works reasonably well, but I want to be able to:
- Pause a specific camera
- Have each `rtspsrc` be completely isolated, so errors in one don't affect the entire pipeline
- Listen for events on a particular camera
If I have all the cameras in a single pipeline, I cannot figure out how to pause a specific camera. I cannot pause the entire pipeline, because that would stop all cameras. The best I've come up with is to unlink and remove the elements for a specific camera, then re-add and re-link them when resuming. This sort of works, but: if a specific `rtspsrc` stops responding, the entire pipeline stops; and if a specific `rtspsrc` doesn't exist, the entire pipeline won't transition to the PLAYING state.
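Concretely, my pause hack looks something like the sketch below. Following the usual dynamic-pipelines pattern, I block the branch with a pad probe before unlinking (the `dec`/`analytics` names match the hypothetical sketch above):

```python
from gi.repository import Gst

def pause_camera(dec, analytics, analytics_sinkpad):
    """Block the camera branch, then unlink it from the shared element."""
    srcpad = dec.get_static_pad("src")

    def on_blocked(pad, info):
        pad.unlink(analytics_sinkpad)
        # Releasing the request pad tells the batching element to
        # stop waiting for data from this stream.
        analytics.release_request_pad(analytics_sinkpad)
        return Gst.PadProbeReturn.OK  # keep the probe so data stays blocked

    srcpad.add_probe(Gst.PadProbeType.BLOCK_DOWNSTREAM, on_blocked)
```

Resuming is the reverse: request a new sink pad, re-link, and remove the blocking probe.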
How should I architect my application? Do you think I should have a single big pipeline? Or should I have one pipeline containing the singleton analytics element, plus a separate pipeline per camera, connected using appsink and appsrc? The latter might make things easier to handle, since each pipeline would be entirely separate.
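If it helps, here is a sketch of the second option as I imagine it (assumptions: H.264 cameras, and the analytics pipeline already contains one named appsrc per camera; all names are made up):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

def make_camera_pipeline(uri, push_sample):
    """One fully isolated pipeline per camera: rtspsrc ... ! appsink."""
    pipe = Gst.parse_launch(
        f"rtspsrc location={uri} ! rtph264depay ! avdec_h264 ! "
        "videoconvert ! appsink name=out emit-signals=true "
        "max-buffers=1 drop=true")
    sink = pipe.get_by_name("out")

    def on_sample(appsink):
        push_sample(appsink.emit("pull-sample"))  # hand off to analytics
        return Gst.FlowReturn.OK

    sink.connect("new-sample", on_sample)
    return pipe

def make_feeder(analytics_pipeline, appsrc_name):
    """Return a callback that pushes one camera's samples into its appsrc."""
    appsrc = analytics_pipeline.get_by_name(appsrc_name)
    caps_set = False

    def push_sample(sample):
        nonlocal caps_set
        if not caps_set:
            appsrc.set_property("caps", sample.get_caps())
            caps_set = True
        appsrc.emit("push-buffer", sample.get_buffer())
    return push_sample
```

With this layout, pausing a camera would just mean setting that camera's pipeline to PAUSED, and an error in one `rtspsrc` would only take down its own pipeline.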
Let me know if you need more info.