
I have a game which can create screenshots, and I want to turn them into an MP4 video. So I have the following command:

ffmpeg -framerate 15 -i %06d.png -s hd1080 -vcodec libx264 -r 30 timelapse.mp4

But my game lasts 8 hours, so even after auto-compressing the pictures I have more than 9 TB of them. So I want to start the ffmpeg process before picture generation has finished, i.e. I want ffmpeg to wait for the next picture and digest it as soon as it appears.

How can I do it?

Chklang
    I don't think it's possible using command line, but it's simple to do using a Python script. The suggested solution writes the frames to FFmpeg stdin pipe as described [here](http://zulko.github.io/blog/2013/09/27/read-and-write-video-frames-in-python-using-ffmpeg/). In case you don't know Python, it's possible to implement in other programming languages. – Rotem Aug 20 '21 at 08:31
  • I confess that I dislike Python because it's a big disaster for ecology, but thanks for the image2pipe and "-i -"!!! I will try it and I will post the solution in the next days :D – Chklang Aug 20 '21 at 13:40
  • @Chklang You need to **clarify**:... **(1)** `I have a game which can create some screenshots` did you make the game or can you modify the source code to also run an external process like FFmpeg? **(2)** Sounds like you need to run FFmpeg as process to send images as video frames on availability... Do you know any programming languages that can also use **std in/out** on external programs? PHP can do it if you have local host server. **(3)** You say _"...But my game lasts 8hrs"_ so how often does it save pictures? Is this automatic or only when you press a button? – VC.One Aug 25 '21 at 18:59
  • Sorry @Rotem, you have given the best solution, but not as an "answer", only as a "comment", so I can't give you the bounty :/ – Chklang Aug 26 '21 at 13:34

3 Answers


Batch process them

If you don't need the video absolutely immediately after the game, try batch processing.

Create videos periodically, maybe every 5 to 30 minutes, using your fresh pictures. Then combine all of the videos when ready. Concatenating a bunch of videos of the same format without re-encoding in ffmpeg is very fast.

This answer has a great overview of the methods for combining videos in ffmpeg.

In your case, you can append the output file names into a text file as each video clip is created. For example, vidlist.txt:

file '/path/to/clip1.mp4'
file '/path/to/clip2.mp4'
file '/path/to/clip3.mp4'

Then use the command:

ffmpeg -f concat -safe 0 -i vidlist.txt -c copy output.mp4

One disadvantage is that you effectively double your disk space usage while creating the final video, although you can delete all of the clips immediately after combining, so the total space used ends up the same. Alternatively, you could concatenate each video clip onto the main video as you go instead of waiting until the end, in which case no additional space is needed.
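
As a rough sketch, a bash loop could run that batching automatically. The 5-minute interval, the batch/ staging directory, and the clip names below are my own assumptions, and it assumes each screenshot is fully written before it gets moved:

clip=0
while true; do
    sleep 300                                    # wait for the next batch of screenshots
    mkdir -p batch && mv ./*.png batch/ 2>/dev/null || continue
    ffmpeg -framerate 15 -pattern_type glob -i 'batch/*.png' \
           -s hd1080 -vcodec libx264 -r 30 "clip$clip.mp4"
    echo "file '$PWD/clip$clip.mp4'" >> vidlist.txt  # record the clip for the final concat
    rm -r batch                                  # free the space used by the processed images
    clip=$((clip + 1))
done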

You may also be able to pipe images to ffmpeg with a script that feeds input intermittently for 8 hours. But depending on how quickly after the game you need the video (likely seconds for a piped script versus a minute or so for the batched approach), batch processing and combining may be the simpler solution.

If this is running on the same hardware as the game, a constant script approach may still be better, as the CPU usage should remain lower than the peaks created by batch processing every few minutes, although those peaks can be mitigated somewhat by restricting the threads available to ffmpeg.
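
For instance, capping the encoder with ffmpeg's -threads option keeps each batch from saturating the CPU (the value 2 here is only illustrative):

ffmpeg -framerate 15 -i %06d.png -s hd1080 -vcodec libx264 -threads 2 -r 30 clip0.mp4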

camtech
  • Instead of waiting until the end and concatenating everything as one, I suppose you can just concatenate each video clip onto the main video as they complete, avoiding the need for potentially large amounts of extra space when compiling at the end. I don't believe there is any disadvantage to concatenating multiple times as there is no re-encode. Edited answer to include this. – camtech Sep 03 '21 at 08:19

Thanks to the Stack Overflow community and to Rotem for the initial solution.

The solution was to use an FFmpeg pipe (via standard input/output), where -f image2pipe declares the format of the incoming data and -i - tells FFmpeg to read that input from stdin as it arrives.

I used Java for the Standard I/O part but any programming language with that feature can apply the same solution.
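
A minimal shell sketch of the same idea could look like the following (it assumes the zero-padded %06d.png naming from the question, numbering from 000000, and that each file is fully written before it is read):

(i=0
 while true; do
     f=$(printf '%06d.png' "$i")
     while [ ! -f "$f" ]; do sleep 1; done    # wait until the game has produced this frame
     cat "$f"                                 # forward the image bytes into the pipe
     i=$((i + 1))
 done) | ffmpeg -f image2pipe -framerate 15 -c:v png -i - -s hd1080 -c:v libx264 -r 30 timelapse.mkv

The loop has no stop condition, so once the last screenshot has gone through you stop the pipeline manually (for example with Ctrl+C).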

I've created a repository on GitHub to do this: https://github.com/Chklang/ffmpeg-digester

Running it is straightforward:

java -jar ffmpeg-digester.jar --ffmpeg ffmpeg.exe --input ./pictures --output myVideo.mkv --codec libx264

(Built-in help is available with "--help" / "-h".)

With this program I've generated an H.265 video, 51 minutes long, 54 GB, at 8K resolution, from 74,000 pictures of 25-35 MB each, in two days.

The result : https://youtu.be/Iqsq7irgUEQ

Chklang

Here is a script that'll generate a stream of pictures of a rotating wizard for testing:

i=0; (while true; do convert magick:logo -rotate $i $i.jpg; i=`expr $i + 1`; sleep 1; done )

(Edit: Note convert is part of ImageMagick.) So start that off to emulate the image process, then start this off in another terminal to generate the video:

(while true; do
    ls *.jpg | \
    sort -n |                   # all images, sorted numerically
    ( while read im; do         # for each image...
        convert $im -resize 800x600\! RGBA:-;   # convert it, writing raw RGBA to stdout
        rm $im;                 # then delete it
    done );
    sleep 1;
done ) |                        # pipe all the raw frames to ffmpeg
ffmpeg -y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - out_vid.avi

The important part is the resize to ensure it is exactly the right size for ffmpeg, and RGBA:- to make sure it's a raw uncompressed stream. It will output a 25fps video, but this is irrespective of the rate pictures are generated.

The script works in bash, but something equivalent will work fine in any modern OS's shell.

It will clear out the jpg images as they get processed, so as long as you've got enough processing power it will keep disk space below the final video size.

More info on the ffmpeg side is available here: is it possible to send ffmpeg images by using pipe? (I just took that answer and added the real-time image handling.)