
Is there a way I can send live audio input from browser to an Icecast server?

I am using the getUserMedia API to capture the audio input, and I want to send it as a live stream to an Icecast server.

getAudioInput(){
  const constraints = { 
    video: false, 
    // Use the selected input device if one is set, otherwise the default microphone.
    audio: {deviceId: this.state.deviceId ? {exact: this.state.deviceId} : undefined},
  };

  // Legacy callback-based getUserMedia: the success callback receives the MediaStream.
  window.navigator.getUserMedia(
    constraints, 
    this.streamAudio, 
    this.handleError
  );
}

In my streamAudio function, I want to stream this to the Icecast server. Can I do this with some sort of XMLHttpRequest, or does it need to be done over a socket?

Stretch0
  • I don't think there is a way to directly send (as a source client) from a browser. At least I haven't heard of anyone doing this successfully. It might very well be possible with some tweaks. – TBR Jun 05 '18 at 13:39
  • @TBR but this can be done in other applications / software right? Are you saying a restriction in JavaScript doesn't allow streaming to Icecast? I may be wrong but it looks as though it is possible in this webcaster client: https://webcast.github.io/webcaster/ – Stretch0 Jun 05 '18 at 13:50
  • OK, looks like someone made it work by using a websocket (which happens to be very similar in behaviour to what we always were doing in Icecast) – TBR Jun 05 '18 at 13:55
  • @TBR what are the limitations of JavaScript that make it so difficult to stream live audio to an Icecast server? What language should I build a client in and what features make it better for the job? – Stretch0 Jun 05 '18 at 18:42

1 Answer


Unfortunately, this isn't directly possible today. See also: Fetch with ReadableStream as Request Body

Basically, browsers don't allow a streamable HTTP request body. Therefore, you can't do a long-running HTTP PUT with data generated on the fly. The request body has to be fully resolved before the request is sent.
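
To illustrate the limitation, here is a minimal sketch (the endpoint URL is a placeholder): the body handed to fetch() has to be a complete value such as a Blob, so the upload can only begin after recording has finished, which rules out a live stream.

// Illustrative only: fetch() needs the whole body up front.
// recordedChunks would be filled elsewhere, e.g. by MediaRecorder's
// dataavailable events, and the request can only be sent once
// recording has stopped and every chunk exists.
const recordedChunks = [];

fetch('https://example.com/upload', {   // placeholder endpoint
  method: 'PUT',
  body: new Blob(recordedChunks, { type: 'audio/webm' }),
});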

The Fetch and Streams specifications state that it should be possible to use a stream as a request body, but no browsers implement it today.

There are only two ways to get streams out of browsers today. The first is Web Sockets. This is the easiest method, but it requires you to handle the encoding of your media data yourself (usually via the MediaRecorder API). The second is WebRTC. With WebRTC, you can either use its MediaStream handling directly (which is difficult to do server-side) or use its data channels. There is no real benefit to using the data channels over Web Sockets if you're just sending data directly to a server.
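
As a rough sketch of the Web Socket approach (not a drop-in solution): the wss:// relay URL, the MIME type, and the 250 ms timeslice below are assumptions, and the server on the other end still has to forward the encoded audio on to Icecast as a source client.

// Inside the question's streamAudio callback, which receives the
// MediaStream from getUserMedia.
streamAudio(stream) {
  const socket = new WebSocket('wss://example.com/ingest'); // hypothetical relay

  // Let the browser encode the audio (typically Opus in a WebM container).
  const recorder = new MediaRecorder(stream, { mimeType: 'audio/webm;codecs=opus' });

  recorder.ondataavailable = (event) => {
    if (event.data.size > 0 && socket.readyState === WebSocket.OPEN) {
      socket.send(event.data); // each chunk is a Blob; WebSocket sends it as binary
    }
  };

  // Produce a chunk roughly every 250 ms so the server sees a near-live feed.
  socket.onopen = () => recorder.start(250);
}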

I've built web-based clients in the past which use the WebSocket method. See also: https://stackoverflow.com/a/40073233/362536

Brad
  • I thought that might be the case. I will go down the route of websockets – Stretch0 Jun 06 '18 at 16:14
  • @Stretch0 Consider also lobbying W3C and browser developers in their issue trackers and such. :-) I think having a proper HTTP client in a browser is an important thing for many use cases, but it probably is never asked for because people assume browsers can't do such useful things. Therefore, it never seems to be a priority. We're *so close* to having this capability. Anyway, let me know if you're interested in licensing my code which already solves your need. brad@audiopump.co – Brad Jun 06 '18 at 16:16
  • As you mentioned in the post you linked to, it seems best to convert the buffer to an Int16 buffer. I am doing this and emitting to my socket server, which just relays it back out to clients. I explain it in more detail here, which you may be able to assist with as well: https://stackoverflow.com/questions/50532474/how-to-create-a-live-media-stream-with-javascript – Stretch0 Jun 06 '18 at 16:20
  • Try the WebRTC API. – Chris P Aug 31 '20 at 19:19
  • Could the browser send content in chunks? I'm thinking of techniques like HLS. – sleblanc Nov 18 '20 at 16:53