Here is the design I'm trying to implement properly: a JavaScript peer sends a video track to a native-code peer. At some point during the transmission (in practice immediately after the connection is established, but it could be at any moment) I want to start a stopwatch on the JS side and perform some timed operations, specifically some rendering on a canvas overlaying the video playback. On the native side I want to synchronize on the instant the stopwatch started on the JS peer and consider only frames recorded after that instant, applying a different kind of processing to them. What I am doing now (a fragile and limiting solution):
- As soon as the peers connect (tracking `RTCPeerConnection.iceConnectionState`), I start the stopwatch on the JS peer;
- As soon as the first `webrtc::VideoFrame` arrives on the native peer, I store that frame's timestamp;
- On the native peer I use the first frame's timestamp to compute relative times, much as the stopwatch lets me do on the JS peer (see the sketch after this list).
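For concreteness, a minimal sketch of the native side of this approach (the class name and `ProcessFrame` are illustrative; `OnFrame` and `timestamp_us()` are the actual WebRTC sink API):

```cpp
#include <atomic>
#include <cstdint>

#include "api/video/video_frame.h"
#include "api/video/video_sink_interface.h"

// Remembers the timestamp of the first frame that arrives and expresses
// every later frame as an offset from it, mirroring the stopwatch that
// runs on the JS peer.
class FirstFrameRelativeSink : public rtc::VideoSinkInterface<webrtc::VideoFrame> {
 public:
  void OnFrame(const webrtc::VideoFrame& frame) override {
    const int64_t now_us = frame.timestamp_us();
    int64_t expected = -1;
    // Store the first frame's timestamp exactly once.
    first_frame_us_.compare_exchange_strong(expected, now_us);
    const int64_t elapsed_us = now_us - first_frame_us_.load();
    ProcessFrame(frame, elapsed_us);
  }

 private:
  void ProcessFrame(const webrtc::VideoFrame& frame, int64_t elapsed_us) {
    // Per-frame processing keyed on the relative time, elided here.
    (void)frame;
    (void)elapsed_us;
  }

  std::atomic<int64_t> first_frame_us_{-1};
};
```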
This design is limiting because I may want to synchronize on an arbitrary instant, not just on connection establishment, and fragile because I believe WebRTC is allowed to drop the very first received frames for any reason (delays or transmission errors). Ideally I would like to take a timestamp at the chosen synchronization point on the JS peer, send it to the native peer, and compare it against `webrtc::VideoFrame` timestamps.
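What I would like to write looks roughly like the sketch below, assuming the JS peer sends the sync instant over an `RTCDataChannel` (`SyncPointSink`, `OnSyncPointReceived`, and `clock_offset_us_` are hypothetical names; the unknown `clock_offset_us_` is exactly the piece I'm missing):

```cpp
#include <atomic>
#include <cstdint>

#include "api/video/video_frame.h"
#include "api/video/video_sink_interface.h"

// Hypothetical sink for the approach I am after: the JS peer picks a sync
// instant and ships it to the native peer (e.g. over a data channel); the
// native peer then processes only frames captured after that instant.
class SyncPointSink : public rtc::VideoSinkInterface<webrtc::VideoFrame> {
 public:
  // Called when the JS peer's sync timestamp (microseconds on the JS
  // peer's clock) arrives on the data channel.
  void OnSyncPointReceived(int64_t js_sync_time_us) {
    js_sync_time_us_.store(js_sync_time_us);
  }

  void OnFrame(const webrtc::VideoFrame& frame) override {
    const int64_t sync_us = js_sync_time_us_.load();
    if (sync_us < 0) return;  // no sync point chosen yet
    // timestamp_us() is on the local clock, so comparing it with the JS
    // peer's clock requires the offset between the two clocks.
    const int64_t frame_time_on_js_clock_us =
        frame.timestamp_us() + clock_offset_us_;
    if (frame_time_on_js_clock_us >= sync_us) {
      ProcessFrame(frame);
    }
  }

 private:
  void ProcessFrame(const webrtc::VideoFrame& frame) {
    // The "other kind of processing", elided here.
    (void)frame;
  }

  std::atomic<int64_t> js_sync_time_us_{-1};
  int64_t clock_offset_us_ = 0;  // unknown skew: the crux of this question
};
```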
I am unable to do this naively because `VideoFrame::timestamp_us()` is clearly skewed by some amount I am not aware of. I also can't interpret `VideoFrame::timestamp()`, which is poorly documented in `api/video/video_frame.h`, and `VideoFrame::ntp_time_ms()` is deprecated and in practice always returns -1.

What should I do to accomplish this kind of synchronization between the two peers?