
I'm using navigator.mediaDevices.getUserMedia({ audio: true }) to get access to the user's microphone. This is my code:

navigator.mediaDevices.getUserMedia({ audio: true })
    .then(stream => {
        mediaRecorder = new MediaRecorder(stream);
        mediaRecorder.start();

        mediaRecorder.addEventListener("dataavailable", event => {
          audioChunks.push(event.data);
        });

        mediaRecorder.addEventListener("stop", () => {
            blob = new Blob(audioChunks);
            audioUrl = URL.createObjectURL(blob);
        });
    });

Then when the user clicks the stop button, it does this:

mediaRecorder.stop();
addSound(audioUrl);
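
addSound itself isn't shown here; it boils down to something like this (a simplified sketch; the real version fills in start and a few other fields):

var audios = [];

function addSound(url) {
    // new Audio(url) creates the element from the blob URL;
    // "start" is the sequencer time (in seconds) at which the clip should play
    var audio = new Audio(url);
    audios.push({ audio: audio, start: 0, played: false });
}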

As the sketch shows, addSound creates an object with the audio element and some other info and pushes it into an array. I also have a loop that works as a sequencer; this is the code:

function loop(){
    secondTimer += (1/40);

    for(var i = 0; i < audios.length; i++){
        var audio = audios[i];
        var start = audio.start;

        if(!audio.played && secondTimer >= start){
            audio.audio.play();
            audio.played = true;
        }
    }

    if(play == true){
        setTimeout(loop, 25);   
    }       
}

Basically, it checks whether the sequencer's current time is greater than or equal to the audio's start time. If it is, it plays the audio and sets its "played" property to true so it can't be played twice.

The problem is that the audio I recorded with the mic is not playing. However, if I create the audio object from an mp3 file, for example, it works.

Another thing to note: if I call audio.play() inside the addSound function, it works. Also, when I go to the Chrome dev tools and run audios[x].audio.play(), it also works. So it only fails when called from the loop function, which is weird.
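
For what it's worth, play() on an HTMLMediaElement returns a Promise in current Chrome, so one way to see whether playback actually starts (or why it doesn't) is to log it, along these lines:

var promise = audios[x].audio.play();
promise.then(function () {
    console.log("playback started");
}).catch(function (err) {
    // a rejection here (e.g. a decode or autoplay error) would explain why nothing is heard
    console.error("playback failed:", err);
});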

This is the src of the audio element, as shown when console-logging it:

blob:http://localhost/4a7bb4a3-1940-496c-a545-a956d1ccbd57

In case it's any help.

Thanks in advance!

  • You're probably hitting a very common bug, where the browser doesn't create valid media with the MediaRecorder when outputting chunks. – Brad Nov 22 '18 at 03:03
  • Is there any workaround? I'm trying to avoid using the server to keep it all client side... – nick Nov 22 '18 at 03:07
  • You could let the MediaRecorder buffer your data, and get one chunk when it's stopped. Alternatively, try a different format. I think this is fixed for WebM in Chrome, but I'm not sure. Last time I dealt with this, I re-muxed the media in-browser with some code I wrote. I'm sorry, I don't have access to that code anymore. – Brad Nov 22 '18 at 03:10
  • Ok, I tried console logging the audio and I saw that when I hit play it outputs `Promise -> pending` on the recorded one and `Promise -> resolved` on the other ones. Once it's resolved it works in the sequencer. Any ideas why it takes so much to resolve? – nick Nov 22 '18 at 03:19
  • What does your `addSound` function do? It sounds like you're trying to play the recorded file before it's actually done being recorded, which doesn't work with all formats. – Brad Nov 22 '18 at 03:22
  • the addSound does this: `var audio = new Audio(url); audios.push({audio: audio, etc...})`. If after creating the Audio element I do this: `audio.play()` it works. That's what throws me off... – nick Nov 22 '18 at 03:29
  • Put your `addSound` in the `stop` handler instead. Don't just call it right away. – Brad Nov 22 '18 at 03:30
  • Please include a Minimal but **Complete** example. Here there is just no link between your two code blocks. We can't possibly guess what's wrong. For me that all sounds like a *"return from async"* problem, but since we have no clue how your code is setup we can't say for sure. – Kaiido Nov 22 '18 at 03:49
  • And @Brad that's not a bug, that's how the MediaRecorder API has been designed; the chunks you get from the dataavailable events are just *chunks* of the whole file that is being recorded. So it's normal that you don't get the headers with every chunk. – Kaiido Nov 22 '18 at 03:52
  • @Kaiido I don't have time right now to post the complete working example but I'll post it later in addition to the workaround I designed. Can you explain what a "return from async" problem is and why if I play it right after creating the audio element it works but if I don't I have to wait an amount of time until the promise resolves? It doesn't make sense to me but maybe it's like that for some purpose... – nick Nov 22 '18 at 03:54
  • @Kaiido I understand that part, but there are legitimate bugs where when you concatenate all the chunks together, you get a file that's unplayable by the browser. (Playable in other things like VLC.) This occurs even with WebM, which can be streamed and doesn't require the initial chunks to be updated later. I'll see if I can find the Chromium bug tracker for it... – Brad Nov 22 '18 at 03:59
  • @Brad yes that's the length problem. I know the bug report, I took part in it. But that's not what I understood you were talking about in your first comment. – Kaiido Nov 22 '18 at 04:01
  • @nick I am talking about https://stackoverflow.com/questions/6847697/how-to-return-value-from-an-asynchronous-callback-function – Kaiido Nov 22 '18 at 04:03
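
Based on Brad's suggestion above, a minimal sketch of that change would be to build the blob and call addSound from inside the "stop" handler, rather than right after mediaRecorder.stop(), since that handler runs asynchronously and audioUrl may not exist yet at the point where addSound(audioUrl) is currently called. Here, stopButton is assumed to be whatever element triggers the stop:

mediaRecorder.addEventListener("stop", () => {
    // by the time "stop" fires, every dataavailable chunk has been delivered,
    // so the blob built here is the complete recording
    const blob = new Blob(audioChunks, { type: mediaRecorder.mimeType });
    const audioUrl = URL.createObjectURL(blob);
    addSound(audioUrl);
});

stopButton.addEventListener("click", () => {
    mediaRecorder.stop();   // addSound is now called from the "stop" handler above
});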
