I'd like to capture my desktop with FFmpeg and send the video through a WebSocket to the client, where the stream can be played in an HTML5 video tag. It seems the current way to accomplish this is to send JPEG data and draw it onto a canvas element, but I would rather stream actual video and audio data before resorting to that.
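For reference, this is roughly what that JPEG-on-canvas fallback looks like (a minimal sketch, assuming the server sends one JPEG frame per binary WebSocket message; the canvas selector is hypothetical):

var socket = new WebSocket('ws://localhost:9393')
socket.binaryType = 'arraybuffer'

// Hypothetical canvas element standing in for the <video> tag.
var canvas = document.querySelector('#desktop-canvas')
var context = canvas.getContext('2d')

socket.onmessage = function(event){
  // Wrap each binary frame in a Blob and draw it once the Image has decoded it.
  var blob = new Blob([event.data], { type: 'image/jpeg' })
  var url = window.URL.createObjectURL(blob)
  var image = new Image()
  image.onload = function(){
    context.drawImage(image, 0, 0, canvas.width, canvas.height)
    window.URL.revokeObjectURL(url)
  }
  image.src = url
}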
Below is my server code. Note that I am running Ruby 2.2.0 on Linux (Debian Stable).
require 'bundler'
Bundler.require
require 'base64' # Base64 lives in the stdlib; Bundler.require won't pull it in

EM.run do
  EM::WebSocket.run host: "0.0.0.0", port: 9393 do |ws|
    # Grab the desktop via x11grab and write a WebM stream to stdout.
    cmd = "ffmpeg -y -f x11grab -s 1600x900 -r 15 -i :0.0 -tune fastdecode -b:v 150k -threads 4 -f webm -"

    @handler = EM.popen3(cmd, stdout: Proc.new { |data|
      puts(data)
      ws.send(Base64.encode64(data)) # ship each chunk of ffmpeg output as Base64 text
    }, stderr: Proc.new { |err| })   # swallow ffmpeg's stderr

    @handler.callback do
      puts "hey"
    end

    @handler.errback do |err_code|
      puts err_code
    end

    ws.onopen do |handshake|
      puts "WebSocket connection open"
    end

    ws.onclose do
      puts "Connection closed"
      # @handler.kill('TERM', true)
    end
  end

  class SimpleView < Sinatra::Base
    set :public_folder, './'

    configure do
      set :threaded, false
    end

    get '/' do
      send_file 'index.html' # serve the page that contains the <video> element
    end
  end

  EM.run do
    Rack::Server.start({
      app: Rack::Builder.app { map('/') { run SimpleView.new } },
      server: 'thin',
      Host: '0.0.0.0',
      Port: '8181'
    })
  end
end
Here is the client code (JavaScript):
var stream = new WebSocket('ws://localhost:9393')
var videoElement = document.querySelector("#desktop")
var videoSource = document.querySelector("source")

window.MediaSource = window.MediaSource || window.WebKitMediaSource;

// Point the <video> element at a MediaSource we can feed manually.
var mediaSource = new MediaSource()
videoElement.src = window.URL.createObjectURL(mediaSource)

stream.onopen = function(){
  console.log('connection open')
}

stream.onclose = function(){
  console.log('connection closed')
}

mediaSource.addEventListener('sourceopen', function(e){
  var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,vorbis"')

  stream.onmessage = function(e){
    // Decode the Base64 chunk back into raw bytes...
    var byteCharacters = atob(e.data)
    var byteNumbers = new Array(byteCharacters.length)
    for (var i = 0; i < byteCharacters.length; i++) {
      byteNumbers[i] = byteCharacters.charCodeAt(i)
    }
    var byteArray = new Uint8Array(byteNumbers)
    // ...and try to hand them to the SourceBuffer.
    sourceBuffer.appendStream(byteArray)
  }
}, false)
What's happening is that I capture FFmpeg's stdout, Base64-encode each chunk of data, and send it through the WebSocket; the client then tries to decode each Base64 chunk into something the SourceBuffer can understand, so the MediaSource can play in the video element.
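To isolate the decode step, this is the per-chunk round trip I'm relying on (just a sketch; decodeChunk is an illustrative name and chunk stands for one Base64 string as sent by the server above):

// Sketch only: not wired into the page, just the decode logic in isolation.
function decodeChunk(chunk) {
  var byteCharacters = atob(chunk)                  // Base64 string -> binary string
  var bytes = new Uint8Array(byteCharacters.length) // one byte per character
  for (var i = 0; i < byteCharacters.length; i++) {
    bytes[i] = byteCharacters.charCodeAt(i)
  }
  return bytes
}

If the round trip is intact, the first decoded chunk should begin with the WebM/EBML magic bytes (0x1A 0x45 0xDF 0xA3).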
Though it doesn't surprise me that this doesn't work, I would still like it to, and I'm hoping there's just one little thing I'm missing. All I get is a black video element.
NOTE: I am using the original FFmpeg, and NOT Avconv. I compiled it with all codecs enabled.
The full source is available at https://github.com/Ravenstine/simpleview.