1

I'm interested in creating an iOS audio visualizer, not one that uses AVPlayer or any similar derivative, but one that can parse an audio signal and create a visualization regardless of the application that is playing it, for instance Spotify or iTunes. From what I understand, you don't have access to that stream programmatically unless the application specifically allows it. Another approach I was thinking of was to use the microphone, but from what I have noticed, using the microphone kills the audio output; I am not sure if this is avoidable or not. Perhaps I'm missing something about how the iOS audio system works. Is there any way to achieve what I am trying to do?
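For reference, here is a minimal sketch of the microphone fallback I had in mind, assuming that an AVAudioSession set to .playAndRecord with the .mixWithOthers option keeps other apps' playback running while the mic is tapped (the MicVisualizerSource class and its RMS callback are just illustrative names):

    import AVFoundation

    // Sketch: drive a visualizer from the microphone while other apps keep playing.
    // Assumption: .playAndRecord + .mixWithOthers keeps iOS from pausing other audio.
    // Requires NSMicrophoneUsageDescription in Info.plist.
    final class MicVisualizerSource {
        private let engine = AVAudioEngine()

        /// Called with a rough RMS level for each captured buffer.
        var onLevel: ((Float) -> Void)?

        func start() throws {
            let session = AVAudioSession.sharedInstance()
            // .mixWithOthers asks iOS not to stop other apps' playback while we record.
            try session.setCategory(.playAndRecord,
                                    mode: .measurement,
                                    options: [.mixWithOthers])
            try session.setActive(true)

            let input = engine.inputNode
            let format = input.outputFormat(forBus: 0)

            // Tap the mic and compute a simple RMS level per buffer.
            input.installTap(onBus: 0, bufferSize: 1024, format: format) { [weak self] buffer, _ in
                guard let channel = buffer.floatChannelData?[0] else { return }
                let frames = Int(buffer.frameLength)
                var sum: Float = 0
                for i in 0..<frames {
                    sum += channel[i] * channel[i]
                }
                let rms = (sum / Float(max(frames, 1))).squareRoot()
                self?.onLevel?(rms)
            }

            try engine.start()
        }

        func stop() {
            engine.inputNode.removeTap(onBus: 0)
            engine.stop()
        }
    }

Even if this works, what gets visualized is whatever reaches the microphone (speaker output plus room noise), not the clean audio stream from the other app.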

user379468
  • 3,989
  • 10
  • 50
  • 68

1 Answer

0

Only if the audio output app is exporting the audio data using Inter-App Audio or Audiobus. Otherwise, the iOS security sandbox will hide that audio output from your app.

hotpaw2
  • 70,107
  • 14
  • 90
  • 153
  • Can you elaborate on any particular limitations around inter-app audio? Does Apple Music support IAA output for visualization? And is there anything special you have to do in your app to take advantage of this? – Petrus Theron May 06 '18 at 20:19
  • Which non-trivial API are you referring to? Is there a way to get at the current Apple Music output for signal analysis? – Petrus Theron May 07 '18 at 07:20
  • 1
    Apple Music does not seem to support Inter-App Audio or Audiobus. Inter-App Audio is a non-trivial API to take advantage of. But there are no other choices currently, AFAIK. – hotpaw2 May 07 '18 at 14:01