I'm using Icecast to stream live audio from internal microphones, and I want listeners to experience as little latency as possible. A naive solution would be to simply access http://myhostname:8000/my_mountpoint to get the stream, but the `<audio>` tag buffers internally before playing, which results in fairly high latency.
Current solution: I used the ReadableStream API to read chunks of the response, decoded them with `decodeAudioData` from the Web Audio API, and played them by routing the decoded data to an AudioContext destination (the internal speakers). This works and brings the latency down significantly.
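Roughly, the approach looks like this (a minimal sketch, assuming an MP3 mount; the URL, chunk handling, and scheduling are illustrative, and error handling is omitted):

```javascript
// Sketch of the low-latency approach: fetch the Icecast mount, read raw
// chunks from the response body, decode each chunk, and schedule playback.
async function streamMount(url) {
  const ctx = new AudioContext();
  let playhead = 0; // time at which the next decoded chunk should start

  const response = await fetch(url);
  const reader = response.body.getReader();

  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;

    // Decoding an arbitrary slice of the MP3 stream is the step that
    // appears to succeed only in Chrome and Opera.
    const audioBuffer = await ctx.decodeAudioData(value.buffer);

    const src = ctx.createBufferSource();
    src.buffer = audioBuffer;
    src.connect(ctx.destination);

    // Schedule chunks back to back to avoid gaps between them.
    playhead = Math.max(playhead, ctx.currentTime);
    src.start(playhead);
    playhead += audioBuffer.duration;
  }
}

// In the page: streamMount('http://myhostname:8000/my_mountpoint');
```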
Problem: The Streams API, while still experimental, should technically work in the latest Chrome, Safari, Opera, and Firefox (after setting a particular flag). However, I'm having problems with `decodeAudioData` in every browser except Chrome and Opera. My suspicion is that Firefox and Safari cannot decode partial MP3 data, because I usually hear only a short burst from the speakers when I start streaming. On Safari, the success callback of `decodeAudioData` is never called, and Firefox simply reports `EncodingError: The given encoding is not supported.`
Are there any workarounds to at least get this working in Safari and Firefox? Is the `decodeAudioData` implementation actually different between Chrome and Safari, such that one handles partial MP3 data and the other doesn't?