I want to stream a real-time video feed coming from UDP into an HTML <video> tag.
I've done some research, but there is a lot of information out there and I'm struggling to get a clear overview of what I can and can't do.
The stream uses the H.264 and AAC codecs in an MP4 container and has a 3840x2160 (4K) resolution. I'd like to play it in Chrome (latest version).
As I understand it so far, the HTML <video> tag can natively play H.264/AAC videos. I got this working with the video file served directly from my server (I'm using Meteor JS + React).
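For context, this is roughly the markup I'm using for that working case (the file path is just a placeholder):

<video controls width="1280">
  <source src="/videos/bbb_sunflower_2160p_30fps_normal.mp4" type="video/mp4" />
</video>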
I learned to use FFmpeg to stream a UDP feed that VLC can read, and then I used FFserver (I know it's deprecated) to create an HTTP stream that VLC can also read, but the HTML <video> tag can't.
So... my question is: can the HTML <video> tag natively play a video stream over HTTP?
I've seen a lot of discussions about HLS and DASH, but I didn't understand whether (and why) they're mandatory.
I read a post about someone creating an HLS m3u8 playlist using only FFmpeg. Is that a viable solution (something like the command sketched below)?
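If I understood that post correctly, the idea would be something along these lines; the UDP address, segment options and output path are my own guesses, not something I've tested:

ffmpeg -i udp://0.0.0.0:1234 -c:v libx264 -c:a aac -f hls -hls_time 4 -hls_list_size 5 -hls_flags delete_segments /path/to/webroot/stream.m3u8

FFmpeg would then keep rewriting stream.m3u8 and the .ts segments next to it, and those files would be served over plain HTTP.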
FFserver configuration
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 20
MaxClients 10
MaxBandwidth 100000
<Feed feed.ffm>
File /tmp/feed.ffm
FileMaxSize 1g
ACL allow 127.0.0.1
</Feed>
<Stream stream.mpeg>
Feed feed.ffm
Format mpeg
AudioCodec aac
AudioBitRate 256
AudioChannels 1
VideoCodec libx264
VideoBitRate 10000            # Total random here
VideoBitRateRange 5000-15000  # And here...
VideoFrameRate 30
VideoQMin 1
VideoQMax 50
VideoSize 3840x2160
VideoBufferSize 20000         # Not sure either
AVOptionVideo flags +global_header
</Stream>
I had to specify VideoQMin and VideoQMax to avoid an error message, but I don't really understand what they do.
FFmpeg command line
ffmpeg -re -i bbb_sunflower_2160p_30fps_normal.mp4 -strict -2 -r 30 -vcodec libx264 http://localhost:8090/feed.ffm
This works with VLC. For now I'm working with a local file before moving to a UDP stream.
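When I switch to the live feed, I assume the input side simply becomes the UDP address (the port below is a placeholder) and -re, which only throttles file input to its native frame rate, is no longer needed:

ffmpeg -i udp://0.0.0.0:1234 -strict -2 -r 30 -vcodec libx264 http://localhost:8090/feed.ffm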