I'd like to be able to capture my desktop using FFmpeg and send the video through a WebSocket to the client, where the stream can be played in an HTML5 video tag. It seems the current way to accomplish this is to send JPEG data and draw it onto a canvas element. I would rather stream actual video and audio data before resorting to that.
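For clarity, here is roughly the canvas approach I'm trying to avoid (just a sketch, assuming the server sends one base64-encoded JPEG frame per WebSocket message; the canvas ID is made up):

// MJPEG-over-canvas fallback: each message is assumed to be one
// base64-encoded JPEG frame. The #desktop-canvas ID is hypothetical.
var canvas = document.querySelector('#desktop-canvas')
var ctx = canvas.getContext('2d')
var frames = new WebSocket('ws://localhost:9393')

frames.onmessage = function(e){
  var img = new Image()
  img.onload = function(){
    ctx.drawImage(img, 0, 0, canvas.width, canvas.height)
  }
  img.src = 'data:image/jpeg;base64,' + e.data
}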

Below is my server code. Note that I am running Ruby 2.2.0 on Linux (Debian Stable).

require 'bundler'
Bundler.require
require 'base64' # Base64 is stdlib and isn't pulled in by Bundler.require

EM.run do
  EM::WebSocket.run host: "0.0.0.0", port: 9393 do |ws|

    cmd = "ffmpeg -y -f x11grab -s 1600x900 -r 15 -i :0.0 -tune fastdecode -b:v 150k -threads 4 -f webm -"
    @handler = EM.popen3(cmd, stdout: Proc.new { |data| 
      puts(data)
      ws.send(Base64.encode64(data))
    }, stderr: Proc.new{|err|})

    @handler.callback do
      puts "FFmpeg process exited"
    end

    @handler.errback do |err_code|
      puts "FFmpeg failed: #{err_code}"
    end

    ws.onopen do |handshake|
      puts "WebSocket connection open"
    end

    ws.onclose do
      puts "Connection closed"
      # @handler.kill('TERM', true)
    end

  end

  class SimpleView < Sinatra::Base
    set :public_folder, './'
    configure do
      set :threaded, false
    end
    get '/' do
      # send_file needs a path; assuming the client page is ./index.html
      send_file 'index.html'
    end
  end

  # The reactor is already running, so start Thin directly rather than
  # nesting a second EM.run call.
  Rack::Server.start({
    app:    Rack::Builder.app { map('/') { run SimpleView.new } },
    server: 'thin',
    Host:   '0.0.0.0',
    Port:   8181
  })

end

Here is the client code (JavaScript):

var stream = new WebSocket('ws://localhost:9393')
var videoElement = document.querySelector("#desktop")
// Prefixed fallback for older WebKit builds.
window.MediaSource = window.MediaSource || window.WebKitMediaSource
var mediaSource = new MediaSource()
videoElement.src = window.URL.createObjectURL(mediaSource)

stream.onopen = function(){
  console.log('connection open')
}

stream.onclose = function(){
  console.log('connection closed')
}

mediaSource.addEventListener('sourceopen', function(e){
  var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,vorbis"')

  stream.onmessage = function(e){
    // Decode the base64 payload back into raw bytes.
    var byteCharacters = atob(e.data)
    var byteArray = new Uint8Array(byteCharacters.length)
    for (var i = 0; i < byteCharacters.length; i++) {
      byteArray[i] = byteCharacters.charCodeAt(i)
    }

    // SourceBuffer has no appendStream() method; appendBuffer() is
    // what actually exists for feeding bytes into a MediaSource.
    sourceBuffer.appendBuffer(byteArray)
  }

}, false)

What's happening is: I capture FFmpeg's stdout, base64-encode each chunk, and send it through the WebSocket; the client then decodes each chunk back into bytes, appends them to a SourceBuffer, and the MediaSource should play in the video element.
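One simplification I've considered (just a sketch, not what the repo currently does): sending raw binary WebSocket frames instead of base64 text, which would drop the encode64/atob round-trip entirely. On the client that would look like:

var stream = new WebSocket('ws://localhost:9393')
// Receive frames as ArrayBuffers instead of strings.
stream.binaryType = 'arraybuffer'

stream.onmessage = function(e){
  // No atob() step; wrap the raw bytes directly.
  var chunk = new Uint8Array(e.data)
  // ...append chunk to the SourceBuffer as before
}

(That assumes the Ruby side can send binary frames; I haven't verified which em-websocket versions support that.)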

It doesn't surprise me that this doesn't work, but I would still like it to, and I'm hoping there's just one small thing I'm missing. All I get is a black video element.
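One pitfall I've since read about: appendBuffer() throws an InvalidStateError if it's called while the SourceBuffer is still processing a previous append, so incoming chunks may need to be queued. A sketch of the usual workaround, reusing the sourceBuffer from the sourceopen handler above:

// Queue chunks while the SourceBuffer is busy; flush on 'updateend'.
var queue = []

function appendChunk(chunk){
  if (sourceBuffer.updating || queue.length > 0) {
    queue.push(chunk)
  } else {
    sourceBuffer.appendBuffer(chunk)
  }
}

sourceBuffer.addEventListener('updateend', function(){
  if (queue.length > 0) {
    sourceBuffer.appendBuffer(queue.shift())
  }
})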

NOTE: I am using the original FFmpeg, not Avconv. I compiled it with all codecs enabled.

The full source is available at https://github.com/Ravenstine/simpleview.

Ten Bitcomb
  • Since it looks like what you really want is live streaming from nodejs to an HTML5 browser, I'd suggest you read these articles: [Best Approach to Real Time Streaming](http://stackoverflow.com/questions/21921790/best-approach-to-real-time-http-streaming-to-html5-video-client) and [Stream live WebM video to browser using Node.js and GStreamer](https://delog.wordpress.com/2011/04/26/stream-live-webm-video-to-browser-using-node-js-and-gstreamer/) – jfriend00 Feb 05 '15 at 01:49
  • Thank you for the links, though those implementations stream video through a normal HTTP response. I am considering that, however, especially since I've had success doing that with other projects. – Ten Bitcomb Feb 05 '15 at 02:23
  • As best I know, a video control doesn't know how to read data from a webSocket so if you want to stream directly into a video control, I think you're going to have a much simpler path by using HTTP. There's no particular reason to use a webSocket here as best I can tell. – jfriend00 Feb 05 '15 at 03:35
  • @jfriend00 the 2nd link is broken. The [working link](https://tewarid.github.io/2011/04/26/stream-live-webm-video-to-browser-using-node.js-and-gstreamer.html) can be found in his new blog. – Amit Beckenstein Aug 04 '19 at 11:55
  • @AmitB. - I can't edit an old comment so your link will have to do. – jfriend00 Aug 04 '19 at 20:35

0 Answers