
I am streaming an ArrayBuffer that I want to convert to an AudioBuffer so that I can listen to it.

I am receiving the stream via a websocket event:

retrieveAudioStream() {
  this.socket.on('stream', (buffer) => {
    // buffer is an ArrayBuffer containing a chunk of audio data
    console.log('buffer', buffer);
  });
}

The buffer is an ArrayBuffer, and I need it to be an AudioBuffer in order to play it in my application.

How can I do this?

– Stretch0

• Take a look at this: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/decodeAudioData – Get Off My Lawn May 24 '18 at 15:00
• According to this SO post https://stackoverflow.com/questions/38589614/webaudio-streaming-with-fetch-domexception-unable-to-decode-audio-data, "AudioContext.decodeAudioData just isn't designed to decode partial files". Since my stream arrives as ArrayBuffer chunks, I am not able to decode it with this method. Any other suggestions? – Stretch0 May 24 '18 at 15:39
• There is a very good example that uses MediaSource (https://github.com/nickdesaulniers/netfix/blob/gh-pages/demo/bufferWhenNeeded.html), which works the same way for audio streams. I use it for streaming audio chunk-wise from a 206 response, and it works very well; see the sketch after these comments. – Jankapunkt Jun 23 '19 at 08:37
• This might help as well. It uses decodeAudioData to decode the ArrayBuffer into an AudioBuffer and appends the chunks together. https://stackoverflow.com/questions/14143652/web-audio-api-append-concatenate-different-audiobuffers-and-play-them-as-one-son/14148125 – Bart Van Remortele Jun 27 '19 at 08:50
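For reference, here is a minimal sketch of the MediaSource approach mentioned above. It assumes the websocket chunks arrive in a browser-supported format such as MP3 ('audio/mpeg') and that `socket` is the client from the question; neither assumption is guaranteed by the original post.

const audio = new Audio();
const mediaSource = new MediaSource();
audio.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  // The MIME type is an assumption; it must match what the server sends
  const sourceBuffer = mediaSource.addSourceBuffer('audio/mpeg');
  const queue = [];

  // appendBuffer() throws if a previous append is still in flight,
  // so flush queued chunks when the buffer becomes free
  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length > 0) sourceBuffer.appendBuffer(queue.shift());
  });

  socket.on('stream', (buffer) => {
    if (sourceBuffer.updating || queue.length > 0) {
      queue.push(buffer);
    } else {
      sourceBuffer.appendBuffer(buffer);
    }
  });

  audio.play();
});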

2 Answers


You can use the BaseAudioContext.createBuffer() method. It is used to

create a new, empty AudioBuffer object, which can then be populated by data, and played via an AudioBufferSourceNode

See MDN for more info: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createBuffer
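A minimal sketch, assuming the incoming chunks are raw 32-bit float PCM at the context's sample rate (the question doesn't say what format the server sends; compressed or integer PCM data would need decoding or conversion first):

const audioCtx = new AudioContext();

function playChunk(arrayBuffer) {
  // Interpret the raw bytes as 32-bit float samples (assumption about the format)
  const floats = new Float32Array(arrayBuffer);

  // Create an empty mono AudioBuffer and copy the samples into it
  const audioBuffer = audioCtx.createBuffer(1, floats.length, audioCtx.sampleRate);
  audioBuffer.copyToChannel(floats, 0);

  // Play it via an AudioBufferSourceNode
  const source = audioCtx.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(audioCtx.destination);
  source.start();
}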

– Orkhan Huseynli

• `AudioBuffer` objects require `Float32Array`s, which is not what the OP has (`ArrayBuffer`s representing chunks of data). Is there a method to convert `ArrayBuffer`s of data chunks into `AudioBuffer`s? – AbyxDev Jun 25 '19 at 13:10
• I believe you can create any typed array from an `ArrayBuffer`, so you should be able to convert an `ArrayBuffer` into a `Float32Array`. – Orkhan Huseynli Jun 25 '19 at 16:44

Since you're streaming media rather than downloading a file and then decoding the audio data, AudioContext.createMediaStreamSource() will be much better suited to your use case.

Read more here: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createMediaStreamSource
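Note that createMediaStreamSource() consumes a MediaStream object (e.g. from getUserMedia() or a WebRTC connection) rather than raw ArrayBuffer chunks, so this only applies if you can get the audio as an actual MediaStream. A minimal sketch using getUserMedia() as a stand-in stream source:

const audioCtx = new AudioContext();

// getUserMedia() stands in here for whatever produces your MediaStream
navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
  // Wrap the live stream in a source node and route it to the speakers
  const source = audioCtx.createMediaStreamSource(stream);
  source.connect(audioCtx.destination);
});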

– Hemant Parashar