
I want to send images as input to ffmpeg, and I want ffmpeg to output video to a stream (WebRTC format).

I found some information that, from my understanding, showed this is possible. I believe that ffmpeg can receive images from a pipe; does anyone know how this can be done?

Evan Edwards
Yanshof

1 Answer


"I want to send images as input to FFmpeg... I believe that FFmpeg could receive image from a pipe, does anyone know how this can be done?"

Yes, it's possible to send FFmpeg images by using a pipe. Use the standardInput to send frames. The frame data must be uncompressed pixel values (e.g. 24-bit RGB format) in a byte array that holds enough bytes (width × height × 3) to write a full frame.
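To make the frame-size math concrete, here is a quick sketch (Python used for illustration only; the asker's language isn't specified, and the 800x600 size is just the example used later in this answer):

```python
# A minimal sketch: size of one uncompressed 24-bit RGB frame,
# and a solid-red frame buffer of exactly that size.
WIDTH, HEIGHT = 800, 600
BYTES_PER_PIXEL = 3                              # 24-bit RGB: one byte each for R, G, B

frame_size = WIDTH * HEIGHT * BYTES_PER_PIXEL    # bytes FFmpeg expects per full frame
red_frame = bytes([255, 0, 0]) * (WIDTH * HEIGHT)

assert len(red_frame) == frame_size              # 1,440,000 bytes for 800x600
```

Every write to the pipe must deliver exactly this many bytes, otherwise FFmpeg's frame boundaries drift.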

Normally (in a Command or Terminal window) you set the input and output like this:

ffmpeg -i inputvid.mp4 outputvid.mp4

But for pipes you must first specify the incoming input's width/height, frame rate, etc. Then also add the incoming input's filename as -i - (where using a blank - means FFmpeg watches the standardInput connection for incoming raw pixel data).

You must put your frame data into some Bitmap object and send the bitmap's pixel values as a byte array. Each send will be encoded as a new video frame. Example pseudo-code:

public function makeVideoFrame ( frame_BMP:Bitmap ) : void
{
    //# Encodes the pixel bytes of a Bitmap object as an FFmpeg video frame
    if ( myProcess.running == true )
    {
        //# read the ARGB pixel values into a byte array
        Frame_Bytes = frame_BMP.bitmapData.getPixels( frame_BMP.bitmapData.rect );

        //# send the data to FFmpeg to encode as a new frame
        myProcess.standardInput.writeBytes( Frame_Bytes );

        Frame_Bytes.clear(); //# empty the byte array for re-use with the next frame
    }
}

Anytime you update your bitmap with new pixel information, you can write it as a new frame by sending that bitmap as the input parameter to the above function, e.g. makeVideoFrame(my_new_frame_BMP);.
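The same per-frame write can be sketched in Python (hypothetical names, for illustration; assumes proc is a subprocess.Popen handle for FFmpeg started with stdin=subprocess.PIPE, and that frame_bytes already holds one full frame's worth of raw pixel data):

```python
import subprocess

def make_video_frame(proc: subprocess.Popen, frame_bytes: bytes) -> None:
    """Send one uncompressed frame to a running FFmpeg process via its stdin pipe."""
    if proc.poll() is None:             # process is still running
        proc.stdin.write(frame_bytes)   # FFmpeg encodes this as one new video frame
        proc.stdin.flush()
```

When you are done, close proc.stdin and call proc.wait() so FFmpeg flushes its encoder and finalizes the output file.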

Your pipe's Process must start with these arguments:

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - ....etc

Where...

  • -f rawvideo -pix_fmt argb means accept uncompressed pixel data in ARGB byte order.

  • -s 800x600 sets an example input width & height; -r 25 sets the frame rate, meaning FFmpeg must encode this many images per one second of output video.

The full setup looks like this:

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - -c:v libx264 -profile:v baseline -level:v 3 -b:v 2500 -an out_vid.h264
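A sketch of launching that full setup as an external process (Python for illustration; assumes the ffmpeg binary is on your PATH, and uses the example width/height/fps and output name from above):

```python
import subprocess

def build_ffmpeg_cmd(width=800, height=600, fps=25, out="out_vid.h264"):
    """Argument list matching: -y -f rawvideo -pix_fmt argb -s WxH -r FPS -i - ..."""
    return [
        "ffmpeg", "-y",
        "-f", "rawvideo", "-pix_fmt", "argb",        # uncompressed ARGB input
        "-s", f"{width}x{height}", "-r", str(fps),   # input size and frame rate
        "-i", "-",                                   # frames arrive on stdin
        "-c:v", "libx264", "-profile:v", "baseline", "-level:v", "3",
        "-an",                                       # no audio track
        out,
    ]

def start_ffmpeg(width=800, height=600, fps=25, out="out_vid.h264"):
    """Start FFmpeg as an external process with a writable stdin pipe."""
    return subprocess.Popen(build_ffmpeg_cmd(width, height, fps, out),
                            stdin=subprocess.PIPE)
```

Write raw ARGB frames (width × height × 4 bytes each) to the returned process's stdin, then close stdin and wait() to finalize the file.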

If you get blocky video output, try setting two output files...

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - -c:v libx264 -profile:v baseline -level:v 3 -b:v 2500 -an out_tempData.h264 out_vid.h264

This will output a test h264 video file which you can later put inside an MP4 container.
The audio track -i someTrack.mp3 is optional.

-i myH264vid.h264 -i someTrack.mp3 outputVid.mp4
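That muxing step can be sketched the same way (Python for illustration; the file names are just the examples above, and ffmpeg is assumed to be on PATH):

```python
def mux_to_mp4(video="myH264vid.h264", audio="someTrack.mp3", out="outputVid.mp4"):
    """Build the FFmpeg argument list that wraps an H.264 stream in an MP4
    container, with an optional audio track (mirrors the command above)."""
    cmd = ["ffmpeg", "-i", video]
    if audio:
        cmd += ["-i", audio]    # the audio input is optional
    cmd.append(out)
    return cmd
```

Pass the returned list to subprocess.run(...) to execute it; drop the audio argument (audio=None) for a video-only MP4.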
VC.One
  • Hello! Thank you for this wonderful answer. I'm facing an issue... I have followed all the instructions you have given, but I am getting a "pipe closed" error before I even write any data. Do you have any tips for me to follow/check? Thank you! – Sreenikethan I Jul 08 '19 at 05:01
  • What programming language says _"pipe closed"_? You have to be running the FFmpeg application as an external `Process` from within your code; that automatically makes a pipe (connection) between your code's functions & vars and the FFmpeg app running from the terminal (it may show an invisible window to the user if running as a `Process`). – VC.One Jul 08 '19 at 22:05
  • I used your command line suggestion and I get `x264 [error]: baseline profile doesn't support 4:4:4`. I think something else is needed if the input `-pix_fmt` is `argb`. – cheshirekow Nov 13 '19 at 17:22
  • You can't use colorspace YUV **444** in H264 (Mpeg), it should be **420**. Try adding pixel format after setting baseline, _eg_: `-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - -c:v libx264 -profile:v baseline -pix_fmt yuv420p -level...etc` – VC.One Nov 13 '19 at 19:41
  • 1
    @VC.One Alright, I had fixed the error. I had meant to say that **FFMPEG** shows a "pipe closed" error, IIRC. I think I had set `ProcessStartInfo.UseShellExecute = False` and hence it allowed me to redirect the streams. – Sreenikethan I Mar 23 '20 at 06:24
  • 1
    Hi how would you do this for a dynamic frame rate? Meaning insert one frame for a duration of say 25 milliseconds, then there best Deanne with a duration of 50 milliseconds etc? – B''H Bi'ezras -- Boruch Hashem Sep 25 '20 at 21:10
  • @SreenikethanI Were you able to send raw data to FFmpeg? Mine always says "invalid data when processing input" and I had to use a named pipe to even be able to send something. Also I'm trying to define the timestamps of each frame, like the person above. – Nicke Manarin Jul 30 '23 at 15:21
  • Problem solved. Wrong set of parameters sent to FFmpeg. https://stackoverflow.com/questions/76767604/pass-individual-frames-as-bgra-byte-array-and-set-the-timestamps-via-pipe-to-ffm/76796090#76796090 – Nicke Manarin Jul 31 '23 at 02:12