
I want to make a simple audio-only stream over WebRTC, using PeerJS. I'm running the simple PeerServer locally.

The following works perfectly fine in Firefox 30, but I can't get it to work in Chrome 35. I would suspect something wrong with the PeerJS setup, but Chrome -> Firefox works perfectly fine, while Chrome -> Chrome seems to send the stream but won't play it over the speakers.

Setting up getUserMedia

Note: uncommenting the commented lines below lets me hear the microphone loopback in both Chrome and Firefox.

navigator.getUserMedia = (navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia || navigator.msGetUserMedia);
window.AudioContext = window.AudioContext || window.webkitAudioContext;

if(navigator.getUserMedia) {
    navigator.getUserMedia({video: false, audio: true}, getMediaSuccess, getMediaError);
} else {
    alert('getUserMedia not supported.');
}

var localMediaStream;
//var audioContext = new AudioContext();

function getMediaSuccess(mediaStream) {
    //var microphone = audioContext.createMediaStreamSource(mediaStream);
    //microphone.connect(audioContext.destination);
    localMediaStream = mediaStream;
}

function getMediaError(err) {
    alert('getUserMedia error. See console.');
    console.error(err);
}
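(For reference: the prefixed callback form of getUserMedia used above has since been deprecated. A minimal promise-based sketch, assuming a modern browser where navigator.mediaDevices is available — the helper name is ours — would be:)

```javascript
// Same audio-only constraints as the callback-based call above.
var audioOnlyConstraints = { video: false, audio: true };

// Promise-based replacement for the prefixed callback API (modern browsers).
// Assumes navigator.mediaDevices is available; requestMicrophone is a
// hypothetical helper name, not part of the original code.
function requestMicrophone() {
    return navigator.mediaDevices.getUserMedia(audioOnlyConstraints);
}
```

A caller would then do `requestMicrophone().then(function(stream) { localMediaStream = stream; })` instead of passing success/error callbacks.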

Making the connection

var peer = new Peer({host: '192.168.1.129', port: 9000});

peer.on('open', function(id) {
    console.log('My ID:', id);
});

peer.on('call', function(call) {
    console.log('answering call with', localMediaStream);
    call.answer(localMediaStream);
    //THIS WORKS IN CHROME, localMediaStream exists

    call.on('stream', function(stream) {
        console.log('streamReceived', stream);
        //THIS WORKS IN CHROME, the stream has come through

        var audioContext = new AudioContext();
        var audioStream = audioContext.createMediaStreamSource(stream);
        audioStream.connect(audioContext.destination);
        //I HEAR AUDIO IN FIREFOX, BUT NOT CHROME

    });

    call.on('error', function(err) {
        console.log(err);
        //LOGS NO ERRORS
    });
});

function connect(id) {
    var voiceStream = peer.call(id, localMediaStream);
}
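One thing worth noting in the snippet above: connect() never attaches a 'stream' handler to the outgoing call, so the calling side has no way to play the remote audio. A sketch that mirrors the answering side's handler (same PeerJS MediaConnection API; the function name is ours) might look like:

```javascript
// Attach a 'stream' handler on the outgoing call too, mirroring the
// handler used in peer.on('call', ...) above.
// connectAndPlay is a hypothetical name; `peer` and `localMediaStream`
// are assumed to exist as in the snippet above.
function connectAndPlay(id) {
    var call = peer.call(id, localMediaStream);
    call.on('stream', function(stream) {
        var audioContext = new AudioContext();
        var source = audioContext.createMediaStreamSource(stream);
        source.connect(audioContext.destination);
    });
    return call;
}
```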
winduptoy

3 Answers


This still appears to be an issue even in Chrome 73.

The solution that saved me for now is to also connect the media stream to a muted HTML audio element. This seems to make the stream work, and audio starts flowing into the Web Audio nodes.

This would look something like:

let a = new Audio();
a.muted = true;
a.srcObject = stream;
a.addEventListener('canplaythrough', () => {
    a = null;
});

let audioStream = audioContext.createMediaStreamSource(stream);
audioStream.connect(audioContext.destination);

JSFiddle: https://jsfiddle.net/jmcker/4naq5ozc/


Original Chromium issue and workaround: https://bugs.chromium.org/p/chromium/issues/detail?id=121673#c121

New Chromium issues: https://bugs.chromium.org/p/chromium/issues/detail?id=687574 and https://bugs.chromium.org/p/chromium/issues/detail?id=933677

jmcker
  • This answer is definitely hacky, but completely harmless if the Chrome team ever does fix the issue. Worked perfectly, thank you! – aecend Mar 30 '20 at 13:02
  • The Chrome issue for this is [Issue 933677: MediaStream from RTC is silent for Web Audio API](https://bugs.chromium.org/p/chromium/issues/detail?id=933677). [#687574 AEC when using Web Audio API](https://bugs.chromium.org/p/chromium/issues/detail?id=687574) is about microphone echo cancellation not working when using Web Audio API (it only works with WebRTC, because AEC is implemented in libwebrtc). – bain May 11 '20 at 13:18
  • Wow. This still requires the workaround in Chrome 105. – jmcker Oct 08 '22 at 22:54
  • March 2023, still here – Vladimir Vlasov Mar 04 '23 at 15:59

In Chrome, there is currently a known bug where remote audio streams received over a peer connection are not accessible through the Web Audio API.

Latest comment on the bug:

We are working really hard towards the feature. The reason why this takes long time is that we need to move the APM to chrome first, implement a render mixer to get the unmixed data from WebRtc, then we can hook up the remote audio stream to webaudio.

It was recently patched in Firefox; I remember this being an issue there as well in the past.

Benjamin Trent
  • Half a year later, this still fails on Chrome M40 – Imskull Feb 12 '15 at 01:48
  • @Imskull, that sucks. Just checked the bug report and Justin Uberti says that they are only about a quarter of the way done and the team that has been working on it/will fix it are pulled away to other issues currently. From what I have read, it is a pretty serious design flaw and not just a coding bug. – Benjamin Trent Feb 12 '15 at 13:58
  • Yes, it seems weird that Google advocates WebRTC everywhere without getting this issue fixed for three years. There's something going on under the hood. – Imskull Feb 13 '15 at 07:10
  • [solved](https://bugs.chromium.org/p/chromium/issues/detail?id=121673#c96) in Chrome Canary during December 2015 – ricricucit Mar 22 '16 at 13:25

I was unable to play the stream using Web Audio, but I did manage to play it using a basic audio element:

var audio = new Audio();
audio.src = (URL || webkitURL || mozURL).createObjectURL(remoteStream);
audio.play();
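(Note: createObjectURL(MediaStream) has since been removed from browsers; the modern equivalent assigns the stream to the element's srcObject property directly. A sketch — the function name is ours — would be:)

```javascript
// Modern replacement for the snippet above: assign the stream to
// srcObject instead of creating an object URL, since
// createObjectURL(MediaStream) was removed from browsers.
// playRemoteStream is a hypothetical helper name.
function playRemoteStream(remoteStream) {
    var audio = new Audio();
    audio.srcObject = remoteStream;
    return audio.play(); // returns a Promise in modern browsers
}
```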