
I'm trying to synchronize an audio track being played via the Web Audio API with a video being played in an HTML5 video element. Using a fibre-optic synchronization device and Audacity, we can measure the drift between the audio and video signals to a very high degree of accuracy.

I've tried detecting the drift between the two sources and correcting it, either by accelerating or decelerating the audio or, as below, by simply setting the video to the same position as the audio.

```javascript
Play() {
  // ...Play logic
  // Record when the track started playing, set when play is triggered.
  startTime = audioContext.currentTime;
}

Loop() {
  // Elapsed time since playback started (currentTime minus startTime,
  // not the other way around).
  let audioCurrentTime = audioContext.currentTime - startTime;

  // Resync the video whenever it drifts more than 100 ms in either direction.
  if (Math.abs(videoElement.nativeElement.currentTime - audioCurrentTime) > 0.1) {
    videoElement.nativeElement.currentTime = audioCurrentTime;
  }

  requestAnimationFrame(Loop);
}
```

With all of this, we still observe a variable drift of around 40 ms between the two sources. I've come to believe that audioContext.currentTime does not report an accurate playback position: when stepping through the code, multiple loop iterations report the same time even though time has obviously passed. My guess is that the reported time is the amount of the track that has been handed off to some internal buffer. Is there another way to get a more accurate playback position from an audio source being played via the Web Audio API?
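For what it's worth, one common approach is to treat `currentTime` as the context's scheduling clock and subtract the output latency to estimate the audible position. This is only a sketch, not the original code: `outputLatency` and `baseLatency` are real `AudioContext` properties but are not supported in every browser, so they fall back to 0 here.

```javascript
// Sketch: estimate the audible playback position of a track whose start was
// recorded as `startTime = ctx.currentTime` when start() was called.
// ctx.outputLatency / ctx.baseLatency are missing in some browsers, so
// absent values fall back to 0.
function audiblePosition(ctx, startTime) {
  const latency = (ctx.outputLatency || 0) + (ctx.baseLatency || 0);
  // currentTime is the scheduling clock, i.e. how much audio has been handed
  // to the output path, not what has reached the speakers; subtracting the
  // output latency gives a closer estimate of the audible position.
  return ctx.currentTime - startTime - latency;
}

// Signed drift between the two clocks; positive means the video is ahead.
function drift(videoTime, audioTime) {
  return videoTime - audioTime;
}
```

Where available, `AudioContext.getOutputTimestamp()` serves a similar purpose, pairing a context time with the corresponding `performance.now()` value.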

Edit: I've updated the code to be a little closer to the actual source. I record the time at which playback was initiated and compare that to the current time to derive the track position. This still reports a time that is not an accurate playback position.

  • AudioContext.currentTime reports the internal time of the AudioContext, and that's it. It doesn't tell us anything about what is being played, and neither have you. Where does the audio source come from? Which Web Audio API object is it? – Kaiido Jan 26 '18 at 01:26
  • Related [HTML5 audio streaming: precisely measure latency?](https://stackoverflow.com/questions/38768375/html5-audio-streaming-precisely-measure-latency/) – guest271314 Jan 26 '18 at 01:27
  • @Kaiido I'm creating an audioContext, then decoding a wav file and loading that into a buffer in the context object, connecting a gain node and starting the playback on the buffer source. – Jeff Langston Jan 26 '18 at 21:04
  • @guest271314 Unfortunately not, I'm using web audio api and not HTML audio objects. Thanks though. – Jeff Langston Jan 26 '18 at 21:04
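Based on the setup described in the comments (decode a WAV file into a buffer, connect a gain node, start the buffer source), a minimal sketch follows; the `track.wav` URL and the 0.1 s scheduling lead are illustrative assumptions, not from the question.

```javascript
// Pure helper: schedule playback slightly in the future so the recorded
// start time is exact, instead of racing the audio thread with start(now).
function scheduleTime(now, lead = 0.1) {
  return now + lead;
}

// Browser-only portion, guarded so this sketch also loads outside a browser.
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();

  fetch("track.wav")                          // illustrative file name
    .then(res => res.arrayBuffer())
    .then(data => ctx.decodeAudioData(data))
    .then(buffer => {
      const source = ctx.createBufferSource();
      source.buffer = buffer;

      const gain = ctx.createGain();
      source.connect(gain);
      gain.connect(ctx.destination);

      const startTime = scheduleTime(ctx.currentTime);
      source.start(startTime); // startTime is now a reliable reference point
    });
}
```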

0 Answers