
I have an audio file/blob that has been created using the MediaRecorder api:

let recorder = new MediaRecorder(this.stream)
let data = [];
recorder.ondataavailable = event => data.push(event.data);

and then later when the recording is finished:

let superBlob = new Blob(data, { type: "video/webm" });

How can I use this blob to create an AudioBuffer? I need to either:

  • Transform the Blob object into an ArrayBuffer, which I could use with AudioContext.decodeAudioData (returns an AudioBuffer), or
  • Transform the Blob object into a Float32Array, which I could copy into the AudioBuffer with AudioBuffer.copyToChannel()

Any tips on how to achieve that are appreciated. Cheers!

Maxime Dupré

5 Answers


To convert a Blob object to an ArrayBuffer, use FileReader.readAsArrayBuffer.

let fileReader = new FileReader();
let arrayBuffer;

fileReader.onloadend = () => {
    arrayBuffer = fileReader.result;
}

fileReader.readAsArrayBuffer(superBlob);
Maxime Dupré
  • I am using this in my implementation but the `fileReader.result` always returns an empty `ArrayBuffer` - do you know what could be happening? Further details can be found in [this question that I asked](https://stackoverflow.com/questions/43572151/web-audio-blob-to-arraybuffer-converstion-results-in-empty-arraybuffer-array) – Alistair Hughes Apr 23 '17 at 15:02
  • Are you sure you are checking `fileReader.result` in the `onloadend` event? – Maxime Dupré Apr 23 '17 at 17:14
  • 2
    Yea I've managed to get it working, it was a trivial mistake on my end so sorry about that. Thanks a lot for posting this answer though, it has been very helpful! – Alistair Hughes Apr 24 '17 at 10:24
  • not sure how trivial the mistake was. I found that you can't access the resulting array buffer directly. I used `data = new Uint8Array(reader.result);` – bobbdelsol Sep 28 '17 at 17:45
  • As stated by PAT-O-MATION, this returns an arrayBuffer, not audioBuffer – Will59 Oct 10 '20 at 22:11
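For reference, the callback-based FileReader API shown above is often wrapped in a Promise so the result can be awaited. A minimal sketch (the `blobToArrayBuffer` name is my own, not from the answer):

```javascript
// Wrap the event-based FileReader API in a Promise so callers can await it.
function blobToArrayBuffer(blob) {
  return new Promise((resolve, reject) => {
    const fileReader = new FileReader();
    fileReader.onloadend = () => resolve(fileReader.result);
    fileReader.onerror = () => reject(fileReader.error);
    fileReader.readAsArrayBuffer(blob);
  });
}

// Usage: const arrayBuffer = await blobToArrayBuffer(superBlob);
```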

The accepted answer is great, but it only gives an ArrayBuffer, which is not an AudioBuffer. You need to use an AudioContext to convert the ArrayBuffer into an AudioBuffer.

const audioContext = new AudioContext()
const fileReader = new FileReader()

// Set up file reader on loaded end event
fileReader.onloadend = () => {

    const arrayBuffer = fileReader.result

    // Convert array buffer into audio buffer
    audioContext.decodeAudioData(arrayBuffer, (audioBuffer) => {

      // Do something with audioBuffer
      console.log(audioBuffer)

    })

}

// Load blob
fileReader.readAsArrayBuffer(blob)

I wish the accepted answer had included an example using decodeAudioData. I had to find one elsewhere, and since this is the top search result for "Blob to Audio Buffer", I thought I would add some helpful information for the next person who comes down this rabbit hole.

PAT-O-MATION
  • do not use this answer, https://stackoverflow.com/questions/66450267/webaudioapi-decodeaudiodata-giving-null-error-on-ios-14-safari, `decodeAudioData` only works with file – Weilory Sep 05 '22 at 13:19

All the answers are true. However, in modern web browsers (Chrome 76+, Firefox 69+), there is a much simpler way: using Blob.arrayBuffer()

Since Blob.arrayBuffer() returns a Promise, you can do either

superBlob.arrayBuffer().then(arrayBuffer => {
  // Do something with arrayBuffer
});

or

async function doSomethingWithAudioBuffer(blob) {
  var arrayBuffer = await blob.arrayBuffer();
  // Do something with arrayBuffer;
}
soundlake
  • Just tried that, and what you get back is an arrayBuffer, not an audioBuffer – Will59 Oct 10 '20 at 22:12
  • @Will59 Would you read the question more carefully? The questioner is aware of how to convert an ArrayBuffer into an AudioBuffer. The question isn't about getting an AudioBuffer from a Blob. – soundlake Oct 11 '20 at 21:48
  • Read your code more carefully then... Don't write 'audioBuffer' where you get in fact an arrayBuffer. – Will59 Oct 12 '20 at 07:41
  • Thanks for taking the time – Will59 Oct 12 '20 at 16:50
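As the comments note, the bytes you get back this way are still the encoded webm data, not PCM samples; viewing them through a typed array is cheap, but producing an AudioBuffer still requires decodeAudioData. A small sketch (the `blobBytes` helper name is my own):

```javascript
// Blob.arrayBuffer() resolves to an ArrayBuffer; a Uint8Array view over it
// exposes the raw (still-encoded) bytes without copying.
async function blobBytes(blob) {
  const arrayBuffer = await blob.arrayBuffer();
  return new Uint8Array(arrayBuffer);
}
```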

A simplified version using an async function:

async function blobToAudioBuffer(audioContext, blob) {
  const arrayBuffer = await blob.arrayBuffer();
  return await audioContext.decodeAudioData(arrayBuffer);
}

I made audioContext a parameter because I recommend reusing AudioContext instances.

Erik Hermansen
  • https://stackoverflow.com/questions/69452545/safari-15-fails-to-decode-audio-data-that-previous-versions-decoded-without-prob – Weilory Sep 05 '22 at 13:52
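One known Safari pitfall is that older versions only implement the callback signature of decodeAudioData, not the Promise-returning one. A compatibility wrapper along these lines (my own sketch; `decodeAudioDataCompat` is not a standard name) always uses the callback form:

```javascript
// Always use the callback form of decodeAudioData (supported in older Safari)
// and expose the result as a Promise.
function decodeAudioDataCompat(audioContext, arrayBuffer) {
  return new Promise((resolve, reject) => {
    audioContext.decodeAudioData(arrayBuffer, resolve, reject);
  });
}
```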

Both answers are correct, with some minor changes. This is the function I finally used:

function convertBlobToAudioBuffer(myBlob) {

  const audioContext = new AudioContext();
  const fileReader = new FileReader();

  fileReader.onloadend = () => {

    let myArrayBuffer = fileReader.result;

    audioContext.decodeAudioData(myArrayBuffer, (audioBuffer) => {

      // Do something with audioBuffer

    });
  };

  // Load blob
  fileReader.readAsArrayBuffer(myBlob);
}
amitgur