I'm currently trying to use FFmpeg with hardware/GPU encoding for the H.264 codec.
What I do is pipe raw frame data directly into ffmpeg and output it to a UDP stream. These are my settings:
var ffmpegArgs = [
    '-c:v', 'rawvideo',   // input codec: raw frames
    '-f', 'rawvideo',     // input format
    '-pix_fmt', 'rgba',   // input pixel format
    '-s', '600x600',      // input frame size
    '-i', 'pipe:0',       // input source: stdin
    '-f', 'mpegts',       // output container format
    '-s', '600x600',      // output frame size
    '-c:v', 'libx264',    // output video codec (CPU encoding)
    '-b:v', '1M',         // output bitrate
    'udp://239.255.123.46:1234' // output destination
];
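For context, this is roughly how I spawn ffmpeg and feed the canvas frames into its stdin (a simplified sketch, not my exact code; getCanvasFrame() is a placeholder for whatever produces one 600x600 RGBA buffer):

var spawn = require('child_process').spawn;

var ffmpeg = spawn('ffmpeg', ffmpegArgs);
ffmpeg.stderr.on('data', function (d) { process.stderr.write(d); }); // log ffmpeg output

// getCanvasFrame() stands in for the code that grabs one 600x600 RGBA
// frame from the canvas (a Buffer of 600 * 600 * 4 bytes)
setInterval(function () {
    var frame = getCanvasFrame();
    ffmpeg.stdin.write(frame); // push the raw frame into pipe:0
}, 1000 / 30); // roughly 30 fps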
In general it is working, but with really miserable quality and latency. The frames arrive about 5 seconds behind and are full of artifacts, so it takes at least 10 or 15 seconds before a whole frame is visible (the video is a "live stream" from a canvas).
I thought that GPU encoding might help here, but I can't get it working. I'm trying to use VAAPI, but no matter which command I try (described here), it doesn't work. One of the variants I tried is sketched below.
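Roughly, the VAAPI attempt looks like this (adapted from the wiki examples; the render node path /dev/dri/renderD128 is an assumption on my part):

var vaapiArgs = [
    '-vaapi_device', '/dev/dri/renderD128', // assumed render node for the Intel GPU
    '-f', 'rawvideo',
    '-pix_fmt', 'rgba',
    '-s', '600x600',
    '-i', 'pipe:0',
    '-vf', 'format=nv12,hwupload', // convert and upload frames to the GPU
    '-c:v', 'h264_vaapi',          // VAAPI H.264 encoder
    '-b:v', '1M',
    '-f', 'mpegts',
    'udp://239.255.123.46:1234'
];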
I'm trying to run this on an Intel NUC (this one) running Ubuntu 16.04.
Are there any tips on how I can get this running?