
I'm receiving raw float32 audio through a websocket and would like to play it back in the browser. From my understanding, I would need to use the MediaStream API for this. However, I cannot find a way to create a MediaStream to which I can append data buffers.

What is the proper way to achieve this?

I'm trying something like this:

    var context = new AudioContext();

    context.sampleRate = 48000;

    var stream = null; // ????

    var source = context.createMediaStreamSource(stream);
    source.connect(context.destination);
    source.start(0);

    socket.onmessage = function (event) {
        stream.appendBuffer(new Float32Array(event.data)); // ????
    };
ronag (asked; question edited by David Jones)
  • Do you have any feedback a few years later? Trying to do the same, and it still doesn't seem to be possible to `appendBuffer` on a stream currently... – bertrandg May 23 '17 at 13:50
  • @ronag I added a bounty to this question because I'd love to know the answer. I also made a couple of edits to the code to bring it up to date. Do you happen to know the answer to this in 2018? – David Jones Oct 24 '18 at 19:50
  • 1. context.sampleRate is a read-only property. [sampleRate docs](https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext#Properties) – dRoyson Oct 27 '18 at 15:53
  • 2. stream has to be a MediaStream object. You can create this using constructor. `stream = new MediaStream()`. [MediaStream docs](https://developer.mozilla.org/en-US/docs/Web/API/MediaStream) – dRoyson Oct 27 '18 at 15:56
  • 3. stream source has no `start` method. Reference for [source](https://developer.mozilla.org/en-US/docs/Web/API/MediaStreamAudioSourceNode#Methods) and its [parent class](https://developer.mozilla.org/en-US/docs/Web/API/AudioNode#Methods) – dRoyson Oct 27 '18 at 15:58
  • I'm not sure how to use MediaStream here, but it does not have an `appendBuffer` method. [Reference](https://w3c.github.io/mediacapture-main/). But you can refer to the following [example](https://developer.mozilla.org/en-US/docs/Web/API/MediaStreamAudioSourceNode#Example) – dRoyson Oct 27 '18 at 16:02
  • I think this can help a bit: https://developer.mozilla.org/en-US/docs/Web/API/AudioBuffer – Lmoro Oct 30 '18 at 00:09
  • @DavidJones, OP if still interested, and others: what does your buffer hold? Raw PCM data? Some encapsulated data? Something else? If not raw PCM, then don't go the WebAudio way; you might rather be interested in the [MediaSource Extension](https://developer.mozilla.org/en-US/docs/Web/API/MediaSource). – Kaiido Nov 01 '18 at 10:57

2 Answers

You should use AudioBuffers to play the sound data read from the websocket.

var context = new AudioContext();
var sampleRate = 48000;
var startAt = 0;

socket.onmessage = function (event) {
    var floats = new Float32Array(event.data);
    var source = context.createBufferSource();
    var buffer = context.createBuffer(1, floats.length, sampleRate);
    buffer.getChannelData(0).set(floats);
    source.buffer = buffer;
    source.connect(context.destination);
    startAt = Math.max(context.currentTime, startAt);
    source.start(startAt);
    startAt += buffer.duration;
};

This plays the audio as it arrives over the websocket.
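One detail the handler above glosses over: a WebSocket delivers `Blob` messages by default, so `event.data` must already be an `ArrayBuffer` for the `Float32Array` view to work. A minimal sketch, assuming the server sends raw little-endian float32 PCM (`toSamples` is a hypothetical helper name):

```javascript
// Assumption: the socket was configured with
//   socket.binaryType = 'arraybuffer';
// so event.data arrives as an ArrayBuffer rather than a Blob.
function toSamples(arrayBuffer) {
  // Reinterpret the raw bytes as 32-bit floats; this creates a typed-array
  // view, not a copy of the payload.
  return new Float32Array(arrayBuffer);
}
```

The resulting array can be fed straight into `buffer.getChannelData(0).set(...)` as in the answer's handler.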

To convert an AudioBuffer into a MediaStream, use AudioContext.createMediaStreamDestination(). Connect the BufferSource to it to make the custom MediaStream based on the buffer's data.

var data = getSound(); // Float32Array;
var sampleRate = 48000;
var context = new AudioContext();

var streamDestination = context.createMediaStreamDestination();
var buffer = context.createBuffer(1, data.length, sampleRate);
var source = context.createBufferSource();

buffer.getChannelData(0).set(data);
source.buffer = buffer;
source.connect(streamDestination);
source.loop = true;
source.start();

var stream = streamDestination.stream;

This reads audio from the data array and converts it into a MediaStream.
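To actually hear the resulting stream (or hand it to an `RTCPeerConnection`), it can be assigned to a media element. A browser-only sketch; `attachStream` is a hypothetical helper name:

```javascript
// Hypothetical wiring: play a MediaStream through an <audio> element.
// srcObject is the modern replacement for URL.createObjectURL(stream).
function attachStream(stream, audioElement) {
  audioElement.srcObject = stream;
  // play() returns a Promise and may require a prior user gesture
  // due to autoplay policies.
  return audioElement.play();
}
```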

daz
Regarding decoding, an AudioContext from the window object should do the job.

var audioCtx = new (window.AudioContext || window.webkitAudioContext)();

and then

audioCtx.decodeAudioData(audioData, function(buffer) {

directly on the binary array.

Regarding communication, I'd rather use XMLHttpRequest (a low-level, long-established API) and use the response directly.

This is a pretty good function made by the MDN folks (I updated the URL of the ogg file so you can test it directly):

function getData() {
  source = audioCtx.createBufferSource();
  request = new XMLHttpRequest();
  request.open('GET', 'https://raw.githubusercontent.com/mdn/webaudio-examples/master/decode-audio-data/viper.ogg', true);
  request.responseType = 'arraybuffer';
  request.onload = function() {
    var audioData = request.response;
    audioCtx.decodeAudioData(audioData, function(buffer) {
        myBuffer = buffer;
        songLength = buffer.duration;
        source.buffer = myBuffer;
        source.playbackRate.value = playbackControl.value; // slider from the full example page
        source.connect(audioCtx.destination);
        source.loop = true;
        loopstartControl.setAttribute('max', Math.floor(songLength));
        loopendControl.setAttribute('max', Math.floor(songLength));
      },
      function(e){ console.error("Error with decoding audio data", e); });
  }
  request.send();
}

The full source code is here:

https://raw.githubusercontent.com/mdn/webaudio-examples/master/decode-audio-data/index.html
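As a side note, the same flow can be written with `fetch` and the promise form of `decodeAudioData` that current browsers support (a sketch; `getDataViaFetch` is a hypothetical name):

```javascript
// Hypothetical fetch-based equivalent of getData() above (browser-only).
function getDataViaFetch(audioCtx, url) {
  return fetch(url)
    .then(function (response) { return response.arrayBuffer(); })
    // decodeAudioData also returns a Promise in current browsers.
    .then(function (data) { return audioCtx.decodeAudioData(data); });
}
```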

C.Vergnaud
  • This method does not work well if you want to decode media content in chunks, i.e. when you have to call decodeAudioData() multiple times. Each decodeAudioData() call produces a short silence at the first decoded frame (at least with MP3). So decodeAudioData() is a good solution only if you decode all the content at once. – crayze Nov 30 '18 at 15:01