14

I would like to wrap real time encoded data to webm or ogv and send it to an html5 browser.

Can webm or ogv do this? Mp4 cannot, because of its MDAT atoms (one cannot wrap h264 and mp3 in real time and send it to the client). Say I am feeding the video from my webcam and the audio from my built-in mic. Fragmented mp4 can handle this, but it is a hassle to find libraries that do it.

I need to do this because I do not want to send audio and video separately.

If I did send them separately, with audio over an audio tag and video over a video tag (audio and video demuxed and sent on their own), could I sync them in the client browser with JavaScript? I saw some examples but am not sure yet.
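For the separate-tags approach, client-side sync can at least be approximated in JavaScript. A minimal sketch (the element names and the 0.3 s threshold are my own assumptions, not from any particular example):

```javascript
// Sketch: keep a separate <audio> element roughly in sync with a <video>
// element. The drift threshold and poll interval are arbitrary choices.
const DRIFT_THRESHOLD = 0.3; // seconds of tolerated clock drift

// Pure helper: does the drift between the two media clocks exceed the threshold?
function needsResync(videoTime, audioTime, threshold = DRIFT_THRESHOLD) {
  return Math.abs(videoTime - audioTime) > threshold;
}

function startSyncLoop(video, audio) {
  setInterval(() => {
    if (needsResync(video.currentTime, audio.currentTime)) {
      // Snap the audio clock to the video clock; a gentler option is to
      // nudge audio.playbackRate up or down until the drift closes.
      audio.currentTime = video.currentTime;
    }
  }, 500);
}
```

In practice the two streams still start at slightly different times, so a loop like this only bounds drift; it does not give frame-accurate lip sync, which is one reason muxing into a single container is preferable.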

Evren Bingøl
  • 1,306
  • 1
  • 20
  • 32
  • I use the Stream-m server to relay webm streams to the client HTML5 video tags. https://github.com/yomguy/stream-m Works well in production. Cheers EDIT: Note that IceCast can now also stream WebM out of the box ;) – yomguy Nov 07 '13 at 22:50

3 Answers

12

I did this with ffmpeg/ffserver running on Ubuntu as follows for webm (mp4 and ogg are a bit easier and should work in a similar manner from the same server, but you should offer all 3 formats for compatibility across browsers).

First, build ffmpeg from source to include the libvpx drivers (even if you're using a version that has them, you need the newest ones (as of this month) to stream webm, because they only just added the ability to include global headers). I did this on an Ubuntu server and desktop, and this guide showed me how; instructions for other OSes can be found here.

Once you've gotten the appropriate version of ffmpeg/ffserver you can set them up for streaming, in my case this was done as follows.

On the video capture device:

ffmpeg -f video4linux2 -standard ntsc -i /dev/video0 http://<server_ip>:8090/0.ffm
  • The "-f video4linux2 -standard ntsc -i /dev/video0" portion of that may change depending on your input source (mine is for a video capture card).

Relevant ffserver.conf excerpt:

Port 8090
#BindAddress <server_ip>
MaxHTTPConnections 2000
MaxClients 100
MaxBandwidth 1000000
CustomLog /var/log/ffserver
NoDaemon

<Feed 0.ffm>
File /tmp/0.ffm
FileMaxSize 5M
ACL allow <feeder_ip>
</Feed>
<Feed 0_webm.ffm>
File /tmp/0_webm.ffm
FileMaxSize 5M
ACL allow localhost
</Feed>

<Stream 0.mpg>
Feed 0.ffm
Format mpeg1video
NoAudio
VideoFrameRate 25
VideoBitRate 256
VideoSize cif
VideoBufferSize 40
VideoGopSize 12
</Stream>
<Stream 0.webm>
Feed 0_webm.ffm
Format webm
NoAudio
VideoCodec libvpx
VideoSize 320x240
VideoFrameRate 24
AVOptionVideo flags +global_header
AVOptionVideo cpu-used 0
AVOptionVideo qmin 1
AVOptionVideo qmax 31
AVOptionVideo quality good
PreRoll 0
StartSendOnKey
VideoBitRate 500K
</Stream>

<Stream index.html>
Format status
ACL allow <client_low_ip> <client_high_ip>
</Stream>
  • Note this is configured for the machine at feeder_ip to execute the aforementioned ffmpeg command, and for the server at server_ip to serve clients from client_low_ip through client_high_ip, while handling the mpeg-to-webm conversion on server_ip (continued below).

This ffmpeg command is executed on the machine previously referred to as server_ip (it handles the actual mpeg --> webm conversion and feeds it back into the ffserver on a different feed):

ffmpeg -i http://<server_ip>:8090/0.mpg -vcodec libvpx http://localhost:8090/0_webm.ffm

Once these have all been started up (first the ffserver, then the feeder_ip ffmpeg process, then the server_ip ffmpeg process) you should be able to access the live stream at http://<server_ip>:8090/0.webm and check the status at http://<server_ip>:8090/index.html

Hope this helps.

CoryG
  • 2,429
  • 3
  • 25
  • 60
  • @EvrenBingøl, could you provide more info? – alfadog67 Dec 20 '16 at 20:48
  • This has been a while, so this is what I remember: the client requests an ogv file from the server which represents the real-time video. The data being sent is converted to ogv on the fly on the server using DirectX. So say you want to do a video chat. You would send your video and audio data to the server with its headers (whatever type that may be), and DirectX converts that to ogv and sends it to the receiving end. Ogv handles chunks very well. – Evren Bingøl Dec 22 '16 at 09:25
  • If anybody using this config, please update Port to HTTPPort, Address to HTTPAddress, remove NoDeamon – ThienSuBS May 15 '17 at 09:56
4

Evren,

Since you asked this question initially, the Media Source Extensions https://www.w3.org/TR/media-source/ have matured enough to be able to play very short (30 ms) ISO-BMFF video/mp4 segments with just a little buffering.

Refer to HTML5 live streaming

So your statement

(one can not wrap h264 and mp3 in real time and wrap it and send it to the client)

is out of date now. Yes you can do it with h264 + AAC.
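The MSE side of that can be sketched roughly as follows (browser-only APIs; the codec string, segment URL, and queue helper are illustrative assumptions, not part of any particular server's API):

```javascript
// Sketch: play live h264+AAC fMP4 segments through Media Source Extensions.
// The mime/codec string and segment URL are placeholders for your own stream.
const MIME = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';

// Pure helper: simple FIFO so segments arriving while the SourceBuffer is
// busy updating are queued instead of dropped.
function makeSegmentQueue() {
  const pending = [];
  return {
    push: (seg) => pending.push(seg),
    next: () => pending.shift(),
    size: () => pending.length,
  };
}

function attachLiveSource(videoElement, segmentUrl) {
  const queue = makeSegmentQueue();
  const ms = new MediaSource();
  videoElement.src = URL.createObjectURL(ms);
  ms.addEventListener('sourceopen', () => {
    const sb = ms.addSourceBuffer(MIME);
    // Drain the queue each time the buffer finishes an append.
    sb.addEventListener('updateend', () => {
      if (queue.size() > 0) sb.appendBuffer(queue.next());
    });
    // Polling fetch loop for brevity; a push transport is more efficient.
    setInterval(async () => {
      const buf = await (await fetch(segmentUrl)).arrayBuffer();
      if (sb.updating || queue.size() > 0) queue.push(buf);
      else sb.appendBuffer(buf);
    }, 100);
  });
}
```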

There are several implementations out there; take a look at Unreal Media Server. From Unreal Media Server FAQ: http://umediaserver.net/umediaserver/faq.html

How is Unreal HTML5 live streaming different from MPEG-DASH? Unlike MPEG-DASH, Unreal Media Server uses a WebSocket protocol for live streaming to HTML5 MSE element in web browsers. This is much more efficient than fetching segments via HTTP requests per MPEG-DASH. Also, Unreal Media Server sends segments of minimal duration, as low as 30 ms. That allows for low, sub-second latency streaming, while MPEG-DASH, like other HTTP chunk-based live streaming protocols, cannot provide low latency live streaming.
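The WebSocket-push pattern that FAQ describes can be sketched generically (this is not Unreal's API; the ws:// URL is a placeholder, and it assumes an MSE SourceBuffer has already been opened):

```javascript
// Sketch: receive small media segments over a WebSocket and append them to
// an already-opened MSE SourceBuffer `sb`, queuing while the buffer is busy.
function makeFeeder(sb) {
  const pending = [];
  const flush = () => {
    if (!sb.updating && pending.length > 0) sb.appendBuffer(pending.shift());
  };
  sb.addEventListener('updateend', flush); // drain after each append completes
  return (segment) => { pending.push(segment); flush(); };
}

function feedFromWebSocket(sb, url = 'ws://example.com/live') {
  const push = makeFeeder(sb);
  const ws = new WebSocket(url);
  ws.binaryType = 'arraybuffer'; // segments arrive as binary frames
  ws.onmessage = (ev) => push(ev.data);
}
```

Because segments are pushed as they are encoded rather than polled over HTTP, latency is bounded mostly by segment duration plus buffering, which is the low-latency advantage the FAQ claims over chunk-based HTTP protocols.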

Their demos webpage has a live HTML5 feed from RTSP camera: http://umediaserver.net/umediaserver/demos.html Notice that the latency in HTML5 player is comparable to that in Flash player.

Community
  • 1
  • 1
user1390208
  • 1,866
  • 20
  • 20
  • 2
    This reads like an advert for Unreal. "There are several implementations out there", yet the only suggestion is a single solution from Unreal that requires a license. Since this is a much more current answer than the previous best from 4 years ago, it'd be nice to see one of the several implementations be one that others can build. – JohnMetta Nov 01 '17 at 15:45
0

Not 100% sure you can do this. HTML5 has not ratified any live-streaming mechanism. You could use WebSockets and send data in real time to the browser, but you would have to write the parsing logic yourself, and I do not know how you would feed the data to the player as it arrives.

As for the video and audio tags: the video tag can play container files that have both audio and video, so wrap your content in a compatible container. If you modify your server to write the live stream into this video file as the content keeps coming in, and stream out that data for every byte requested by the browser, this could be done. But it is definitely non-trivial.

av501
  • 6,645
  • 2
  • 23
  • 34
  • 1
    I need to read a wmv file and transcode it in real time at rendering speed (as if I am watching it, not writing to a file), encode to VP8 and wrap it in webm. The browser is going to point at file.webm, which is being transcoded. When the request is made, I am going to long-poll on the server side, and I am going to write to that http socket (you can name it whatever, say "response"). So there is one-way communication: the server is pushing the webm file to the browser. The browser is progressively downloading a file, at least it thinks that it is a file, but it actually is a live stream being wrapped in webm. – Evren Bingøl Sep 11 '12 at 21:37
  • 2
    Doable. Perhaps simpler to write to a file directly and serve the file off a web server. Use the web server's throttling, set to the target bitrate, to ensure it is served in "real time" and not faster or slower. Give the transcoding some head start to get some room. Should work. The throttling is important so that the player cannot pull faster than the speed at which you are producing (the target bitrate). – av501 Sep 11 '12 at 21:50