14

Please, is there any easy way to stream (broadcast) a media file (ogg, mp3, spx…) from a server to a client (browser) via Node.js, possibly with Socket.IO?

I have to record audio input on the server side and then be able to play it back in real time to many clients. I've been messing with binary.js and Socket.IO streams but wasn't able to get it right.

I've tried encoding the audio input with Speex, Vorbis or LAME and then loading it via fs to the client, but I haven't been successful. Or do I have to capture PCM and then decode it in the browser?

Nothing I've found so far has helped, so I'd appreciate any suggestions.

Many thanks for any tips, links and ideas.

ango
  • I'm also interested in live audio streaming. The more I'm reading about it, node streams, back-pressure, buffering and all the stuff you need to take care of, the less I know how to tackle that. There's a [nice post](http://stackoverflow.com/questions/21921790/best-approach-to-real-time-http-streaming-to-html5-video-client). I'm trying to avoid using SHOUTcast/Icecast but maybe that'd be the easiest way. – maxdec Aug 13 '14 at 17:49
  • [topic on MDN](https://developer.mozilla.org/en-US/Apps/Build/Audio_and_video_delivery/Live_streaming_web_audio_and_video) – maxdec Aug 13 '14 at 18:04
  • What is really important for me is to eliminate delay as much as possible. I don't care about quality. The HTML5 audio/video element by itself is delayed in all major browsers with everything I've tried so far. So AudioContext and getUserMedia are probably the way. But still no success for me. – ango Aug 14 '14 at 14:48

3 Answers

16

You'll want to look for packages that work on Streams; from there it's just a matter of piping your streams to the output as necessary. Using Express, or just the built-in `http` module, you can accomplish this quite easily. Here's an example built around osx-audio, which provides a PCM stream; lame, which can encode a stream to MP3; and Express:

var Webcast = function(options) {

  var lame = require('lame');
  var audio = require('osx-audio');

  // create the Encoder instance
  var encoder = new lame.Encoder({
    // input
    channels: 2,        // 2 channels (left and right)
    bitDepth: 16,       // 16-bit samples
    sampleRate: 44100,  // 44,100 Hz sample rate

    // output
    bitRate: options.bitrate,
    outSampleRate: options.samplerate,
    mode: (options.mono ? lame.MONO : lame.STEREO) // STEREO (default), JOINTSTEREO, DUALCHANNEL or MONO
  });

  // pipe the raw PCM input from the sound card into the MP3 encoder
  var input = new audio.Input();
  input.pipe(encoder);

  // set up an express app
  var express = require('express');
  var app = express();

  // every client that requests /stream.mp3 gets the encoder's output
  // piped into its response
  app.get('/stream.mp3', function (req, res) {
    res.set({
      'Content-Type': 'audio/mpeg',
      'Transfer-Encoding': 'chunked'
    });
    encoder.pipe(res);
  });

  app.listen(options.port);
};

module.exports = Webcast;

How you get your input stream might be the most interesting part, but that will depend on your implementation. The popular request package is built around Streams as well, though, so it might just be an HTTP request away!

milkandtang
  • Do you have an example of this with socket.io? You have this over HTTP: `app.get('/stream.mp3', function (req, res) { res.set({ 'Content-Type': 'audio/mpeg3', 'Transfer-Encoding': 'chunked' }); encoder.pipe(res); });` and with socket.io? Something like this? `socket.on('audio', function(data){ encoder.pipe(data); // where is the emit or send to the client? });` – cmarrero01 Mar 02 '15 at 20:17
  • Haven't tried this solution yet, but looks interesting. I'll let you know how this went. – AllJs Jul 30 '18 at 06:28
  • can anyone point me to some resources that could achieve this in Flask/Flask-SocketIO? – Josh Katofsky Feb 02 '21 at 03:20
5

In the web browser you have the HTML5 `video` and `audio` elements, and both take a source. Each browser supports different codecs natively, so you'll want to watch out for that if you're trying to stream MP3.

You don't need socket.io; you only need HTTP. Your app reads a file, music.ogg, and for each chunk it reads, it sends that chunk through the HTTP server. The whole thing is a single HTTP request that is kept open until the file has been transferred.

Here's how your html will look:

<audio src="http://example.com/music.ogg"></audio>

And your Node.js code will be something like this (untested):

var http = require('http');
var fs = require('fs');

http.createServer(function(request, response) {
    // createReadStream gives us a readable stream we can pipe
    // straight into the HTTP response
    var inputStream = fs.createReadStream('/path/to/music_file.ogg');
    inputStream.pipe(response);
}).listen(8080);

I'm only using the `ReadableStream.pipe` method on the inputStream, plus the `http` and `fs` modules, in the code above. If you want to transcode the audio file (for example, from mp3 to ogg) you'll want to find a module that does that and pipe the data from the file into the transcoder, then into the response:

// using some magical transcoder
inputStream.pipe(transcoder).pipe(response);

`pipe` will call `end` on the destination whenever the source stream finishes, so the HTTP response is closed as soon as the file is done being read (and transcoded).

  • 5
    Thanks a lot for the answer, but what I really meant was to make something like a "live stream" synchronised to all clients. So do you think there is any way to, for example, chop the audio source into parts (or use recorded PCM as the source) and provide those parts to clients? The WebRTC way gives really good results, but isn't sustainable with more clients connected. – ango Jul 18 '14 at 15:56
  • 2
    You need to use `createReadStream` instead of `open` to pipe things. – wieczorek1990 Nov 15 '14 at 15:58
1

You can do this with Node and WebRTC. There are tools ready to use, like SimpleWebRTC or EasyRTC. From what I've tested, video is still troublesome, but audio works great.

Scoup