
I need a way to play a live audio stream using HTML5 (primarily in Google Chrome), so I tried using the following:

<audio>
    <source src="my-live-stream.ogg" type="audio/ogg">
</audio>

While this does work for a live stream, there is a very large, uncontrollable delay/buffer of around 30 seconds when it plays.

I need the delay to be a couple of seconds or less so this method doesn't work.

As an alternative, I have tried sending the audio over a WebSocket connection as individual one-second audio files, which are then appended to a SourceBuffer and played in an audio element using Media Source Extensions.
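For reference, the append path looks roughly like the sketch below (the MIME string and WebSocket wiring are placeholders, not my exact code). The queue is the important part: `appendBuffer()` throws an `InvalidStateError` if it is called while `sourceBuffer.updating` is still `true`, so incoming segments are queued and flushed one at a time on `updateend`:

```javascript
// Minimal sketch of the MSE append path. Segments arriving faster than the
// SourceBuffer can consume them are held in a FIFO queue and appended one
// at a time, because appendBuffer() must not be called while updating.
function createAppender(sourceBuffer) {
  const queue = [];
  function flush() {
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  }
  sourceBuffer.addEventListener('updateend', flush);
  return {
    push(segment) { // segment: ArrayBuffer received over the WebSocket
      queue.push(segment);
      flush();
    },
  };
}

// Browser wiring (placeholder URL and MIME type, browser-only APIs):
// const mediaSource = new MediaSource();
// audioElement.src = URL.createObjectURL(mediaSource);
// mediaSource.addEventListener('sourceopen', () => {
//   const sb = mediaSource.addSourceBuffer('audio/webm; codecs="vorbis"');
//   const appender = createAppender(sb);
//   socket.onmessage = (e) => appender.push(e.data);
// });
```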

After experimenting with a number of formats (MediaSource.isTypeSupported seems to be rather limited in audio support), I got this working using a Vorbis audio stream in a WebM container, which sounds perfect with no audible gaps. Other formats worked less well because their segments are not gapless: MP3 and AAC, for example, end up with tiny audible gaps between each one-second segment (likely due to encoder padding at segment boundaries).

While this seems to work at first, when looking at chrome://media-internals, the following errors repeatedly appear:

00:00:09 544    info    Estimating WebM block duration to be 3ms for the last (Simple)Block in the Cluster for this Track. Use BlockGroups with BlockDurations at the end of each Track in a Cluster to avoid estimation.
00:00:09 585    error   Large timestamp gap detected; may cause AV sync to drift. time:8994999us expected:9231000us delta:-236001us
00:01:05 239    debug   Skipping splice frame generation: not enough samples for splicing new buffer at 65077997us. Have 1us, but need 1000us.

Eventually playback stops as though the pause button had been pressed. The element still shows the pause button rather than the play button, but the current time stops advancing:

[Screenshot of the audio element]

Pressing the pause button and then the play button that replaces it doesn't restart playback, but manually dragging the position slider further ahead makes it continue playing.
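The fact that dragging the slider works suggests currentTime has landed just before a tiny unbuffered gap. As a stopgap (not a fix for the encoding), I have been considering a watchdog that nudges currentTime to the start of the next buffered range; this is only a sketch, with the range scan pulled out as a plain function so the jump logic is visible:

```javascript
// Given the element's buffered ranges as [start, end] pairs and the stalled
// currentTime, return the start of the next buffered range (plus a small
// epsilon to land inside it), or null if there is no buffered data ahead.
function nextBufferedStart(ranges, currentTime) {
  for (const [start, end] of ranges) {
    if (currentTime >= start && currentTime < end) return null; // still inside a range
    if (start > currentTime) return start + 0.001;              // jump over the gap
  }
  return null;
}

// Browser wiring (sketch): poll while playing and jump when stalled.
// setInterval(() => {
//   if (audio.paused) return;
//   const ranges = [];
//   for (let i = 0; i < audio.buffered.length; i++) {
//     ranges.push([audio.buffered.start(i), audio.buffered.end(i)]);
//   }
//   const target = nextBufferedStart(ranges, audio.currentTime);
//   if (target !== null) audio.currentTime = target;
// }, 500);
```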

I have tried setting sourceBuffer.mode = 'sequence'; but this doesn't seem to help.

Is there anything that needs to be changed in how the audio files are being encoded, or how they are played back in JavaScript to fix this?


Additional details:

  1. The audio stream is encoded into 1 second WebM/Vorbis files using FFmpeg on Windows.
  2. A background worker is used in the browser to receive the audio segments and pass them to the main page thread, which appends them to the audio stream. Otherwise the playback freezes.
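For completeness, the FFmpeg invocation is along these lines (the DirectShow device name is a placeholder; the segment muxer options are the relevant part):

```shell
# Sketch only: capture audio on Windows via DirectShow (device name is a
# placeholder) and cut it into one-second Vorbis/WebM segments with the
# segment muxer.
ffmpeg -f dshow -i audio="Microphone (Placeholder Device)" \
       -c:a libvorbis \
       -f segment -segment_time 1 -segment_format webm \
       out%05d.webm
```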

Source code:

    See [HTML5 audio streaming: precisely measure latency?](http://stackoverflow.com/questions/38768375/html5-audio-streaming-precisely-measure-latency) – guest271314 Oct 17 '16 at 22:36
