
For a live demonstration see: http://codepen.io/rrorg/pen/WxPjrz?editors=0010

When playing an HTTP audio live stream in Safari, the analyser's getByteFrequencyData fills the array with zeroes.

In all other browsers this works as expected, and Safari has no problems correctly populating the frequency data for static files.

CORS headers are correctly set, and the Apple documentation mentions no special cases.
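For reference, the setup that fails can be sketched roughly as follows (the stream URL is a placeholder; `averageLevel` is a hypothetical helper added here just to make the symptom visible):

```javascript
// Pure helper (hypothetical, for illustration): average of the
// byte frequency bins, each in the range 0-255.
function averageLevel(bins) {
  let sum = 0;
  for (let i = 0; i < bins.length; i++) sum += bins[i];
  return bins.length ? sum / bins.length : 0;
}

// Browser-only wiring; guarded so the sketch is self-contained.
if (typeof AudioContext !== "undefined") {
  const audio = new Audio("https://example.com/live-stream"); // placeholder URL
  audio.crossOrigin = "anonymous"; // CORS headers are set server-side

  const ctx = new AudioContext();
  const source = ctx.createMediaElementSource(audio);
  const analyser = ctx.createAnalyser();
  source.connect(analyser);
  analyser.connect(ctx.destination);

  const bins = new Uint8Array(analyser.frequencyBinCount);
  function draw() {
    requestAnimationFrame(draw);
    analyser.getByteFrequencyData(bins);
    // In Safari, for live streams, averageLevel(bins) stays at 0.
    console.log(averageLevel(bins));
  }
  audio.addEventListener("canplay", () => {
    audio.play();
    draw();
  });
}
```

In Chrome and Firefox the bins fill with real data; in Safari they remain all zero for live streams, while static files populate normally.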

René Roth
  • Could you give some environment details? Safari version, webkit version, OS etc.. ty – JayIsTooCommon Aug 18 '16 at 09:41
  • @JayIsTooCommon tested with most recent stable releases as of writing, both in Desktop Safari, Mobile Safari via Simulator and Mobile Safari on real device – René Roth Aug 18 '16 at 10:25
  • How about some simple code to demonstrate the problem? – Raymond Toy Aug 22 '16 at 18:37
  • @RaymondToy I linked the demo in the first line :) http://codepen.io/rrorg/pen/WxPjrz?editors=0010 – René Roth Aug 22 '16 at 20:24
  • edited for clarity – René Roth Aug 22 '16 at 20:25
  • Chrome on Android had this problem and maybe Safari does too: MediaElementSource nodes didn't pass the data from the source to WebAudio. – Raymond Toy Aug 22 '16 at 21:10
  • @RenéRoth, Have you found a solution or a workaround? I'm struggling with this issue and can't find any info on it (maybe because it is an uncommon task: visualizing streaming audio). I tried to stream the response and parse chunks of data with `AudioContext.decodeAudioData()`, but the result is far from OK for now – Michael Romanenko Oct 09 '16 at 19:41
  • @MichaelRomanenko I wish, man, but nothing I tried so far worked! Let alone that I seem to be the only person on the internet with that problem. – René Roth Oct 11 '16 at 10:42
  • @MichaelRomanenko the only option I see left is requesting code level support from Apple themselves, but you can't buy those packages separately, you have to be an Apple Dev. – René Roth Oct 11 '16 at 10:43
  • Really interesting. There might be a deeper issue / connection here. I just came across this problem and your question reminds me of something: for six years I haven't been able to do the same thing with the iOS or macOS system frameworks. I assume Safari uses the `AVPlayer` class to implement the `audio` tag. Here is my most-voted question – unanswered since 2013... https://stackoverflow.com/questions/19403584/avplayer-hls-live-stream-level-meter-display-fft-data – Julian F. Weinert Jun 14 '19 at 22:33
  • Not sure if I needed to mention it, but Safari still does not give FFT data using the current version (as of 2019) when playing (HLS) live streams. Bummer – Julian F. Weinert Jun 14 '19 at 22:35

1 Answer


You're not going to like this: Safari does not support createMediaElementSource.

Source: http://caniuse.com/#feat=audio-api

It's due to the lack of support for: http://caniuse.com/#feat=stream

Solution? ...Adobe Flash :(

The latest Safari WebKit nightly seems to have solved this, but that doesn't help for now :/
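As a workaround where a stream isn't strictly required, the buffer-based approach from Apple's own documentation avoids createMediaElementSource entirely: fetch the audio, decode it, and play it through an AudioBufferSourceNode. This is only a sketch under the assumption that the source is a finite file (the URL is a placeholder); as noted in the comments, a request for an endless live stream never completes, so this does not solve the live-stream case:

```javascript
// Wire a decoded buffer through an analyser to the destination.
// Returns both nodes so the caller can start playback and poll the analyser.
function setupBufferAnalyser(ctx, decodedBuffer) {
  const source = ctx.createBufferSource();
  source.buffer = decodedBuffer;
  const analyser = ctx.createAnalyser();
  source.connect(analyser);
  analyser.connect(ctx.destination);
  return { source, analyser };
}

// Browser-only wiring; guarded so the sketch is self-contained.
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  fetch("https://example.com/audio.mp3") // placeholder; must be a finite file
    .then((res) => res.arrayBuffer())
    .then((data) => ctx.decodeAudioData(data))
    .then((buffer) => {
      const { source, analyser } = setupBufferAnalyser(ctx, buffer);
      source.start();
      // analyser.getByteFrequencyData(...) now works in Safari too.
    });
}
```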

seahorsepip
  • But why is createMediaElementSource working, as long as I use a static file and not a stream? The caniuse table btw refers to user stream input (webcam/microphone) - so we're not much closer to a solution now :( but thank you very much for the tip with the Safari nightly, I'll have to check this out! – René Roth Aug 24 '16 at 07:41
  • Just checked in the latest Nightly, problem still persists - please see the demo in my initial question – René Roth Aug 24 '16 at 07:50
  • Hmm, I thought I read it was supported on the nightlies, but I probably read an old post about the Chrome WebKit nightly then :/ I just checked the Safari developer site: https://developer.apple.com/library/safari/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/PlayingandSynthesizingSounds/PlayingandSynthesizingSounds.html and I see they use an XMLHttpRequest and a buffer method instead of createMediaElementSource. – seahorsepip Aug 24 '16 at 08:18
  • Problem is, as far as I know, XMLHttpRequests never finish when trying to fetch a stream and thus can't be used, so I'm forced to use an audio tag. – René Roth Aug 24 '16 at 12:12
  • Hmm there's the web audio api but that requires the audio to be encoded in chunks: http://stackoverflow.com/questions/6593738/audio-data-streaming-in-html5 – seahorsepip Aug 24 '16 at 12:38
  • Could you test the following link on safari: https://jsfiddle.net/aLstyxnj/ ? (Don't have safari since I don't have a mac and apple decided to drop safari for windows) – seahorsepip Aug 24 '16 at 12:57
  • Now you just copied my demo from CodePen to JSFiddle... not sure what you expect to behave differently? – René Roth Aug 25 '16 at 10:15
  • There was a WebKit bug that made createMediaElementSource return null when it was called without window.onload. I copied it to JSFiddle since I could set it to window.onload there; I couldn't find any options on CodePen about domready or onload. – seahorsepip Aug 25 '16 at 12:50
  • Ah, thanks for the try! I've been waiting for the canplay event though, so this was unlikely to change anything either way :/ – René Roth Aug 25 '16 at 14:24
  • As I commented above, this is an underlying issue with the `AVPlayer` class in the `AVFoundation` system framework on iOS and macOS. I suspect Safari implements the `audio` tag using this player. This player has had this strange issue with HLS live streams for more than six years. Pretty sure Apple knows about it (I filed multiple bug reports over the years) but simply doesn't care... – Julian F. Weinert Jun 14 '19 at 22:39