
I have a stream of org.webrtc.VideoFrame frames.

The frames arrive one by one. Is there a library or tool to convert the frames into a video stream on the fly?

I can successfully convert the VideoFrames to byte arrays (similar to this question, which does it for an image): Android org.webrtc.VideoRenderer.I420Frame arrays to PreviewCallback.onPreviewFrame byte[]

I would like to create a playable video stream, but when I simply write the raw bytes with a FileOutputStream (or any other stream that could be passed to FFmpeg, for example), the result is not playable. It seems a muxer is needed to create the file?

private fun addMetaDataToVideo() {
    val file = File(
        context.getExternalFilesDir(Environment.DIRECTORY_DCIM).toString() + "/KYR",
        "$videoNamePrefix.mp4"
    )
    FileOutputStream(file).use { out ->
        listOfFrames.forEach { out.write(it) }
    }
    addMetaDataToVideo(file)
}


private fun addMetaDataToVideo(videoFile: File) {
    val values = ContentValues(3)
    values.put(MediaStore.Video.Media.MIME_TYPE, "video/mp4")
    // values.put(MediaStore.Video.Media.DURATION, getVideoDuration(videoFile))
    values.put(MediaStore.Video.Media.DATA, videoFile.absolutePath)
    context.contentResolver.insert(MediaStore.Video.Media.EXTERNAL_CONTENT_URI, values)
}

Kyryl Zotov

1 Answer


Using ffmpeg is the most common way to do it. Can you put your frames into some sort of image array, BufferedImage or something like it? Once you have your images you can use the following command, as explained here: https://hamelot.io/visualization/using-ffmpeg-to-convert-a-set-of-images-into-a-video/

Try writing your frames out as images (e.g. via a BufferedImage), then run:

ffmpeg -r 60 -f image2 -s 1920x1080 -i pic%04d.png -vcodec libx264 -crf 25  -pix_fmt yuv420p test.mp4
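If writing intermediate image files is too slow for an on-the-fly pipeline, another option is to pipe the raw YUV frames straight into ffmpeg's stdin using its rawvideo demuxer, so no PNG step is needed. The sketch below only covers building the command line and streaming the bytes; the resolution, frame rate, and output path are illustrative assumptions, and it presumes an ffmpeg binary is actually available on the device:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.util.List;

public class RawFramePiper {

    // Builds a hypothetical ffmpeg invocation that reads raw I420 frames
    // from stdin ("-i -") and encodes them into an MP4. Width, height and
    // fps must match the actual frame data you write.
    public static String[] ffmpegCommand(int width, int height, int fps, String outFile) {
        return new String[] {
            "ffmpeg",
            "-f", "rawvideo",          // no container: bare frames on stdin
            "-pix_fmt", "yuv420p",     // I420 layout, as produced from org.webrtc frames
            "-s", width + "x" + height,
            "-r", String.valueOf(fps),
            "-i", "-",                 // read input from stdin
            "-c:v", "libx264",
            "-pix_fmt", "yuv420p",
            outFile
        };
    }

    // Writes each frame's bytes to the given stream (e.g. the ffmpeg
    // process's stdin from ProcessBuilder) and returns the byte count.
    public static long writeFrames(List<byte[]> frames, OutputStream out) throws IOException {
        long total = 0;
        for (byte[] frame : frames) {
            out.write(frame);
            total += frame.length;
        }
        out.flush();
        return total;
    }
}
```

On Android you would start the encoder with something like `new ProcessBuilder(RawFramePiper.ffmpegCommand(1920, 1080, 30, "out.mp4")).start()` and pass `process.getOutputStream()` to `writeFrames`; since stock Android does not ship ffmpeg, that usually means bundling a mobile ffmpeg build with the app.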

Eugen
  • I agree that using ffmpeg is the most common way, but because it's an Android phone I want to try combining the ByteBuffers directly, since it's a YUV image. With ffmpeg I would have to build a complex pipe that might not be efficient and would in the end produce the same bytes I am working with now – Kyryl Zotov Jun 25 '20 at 07:07
  • So ffmpeg is a good fit for a server or a big system, but my need is to operate on the frames on the fly – Kyryl Zotov Jun 25 '20 at 07:08
  • So your priority is to redirect your incoming stream to a new output stream, without delays? I think you will find a working solution for ffmpeg here: https://trac.ffmpeg.org/wiki/StreamingGuide – Eugen Jun 25 '20 at 09:37
  • Another idea would be to use two threads, one for your incoming frames and one for your output, letting them join after every new frame and feeding the images (of the frames) to the ffmpeg output-stream conversion method. This way, after each new image that ffmpeg streams to the output, the thread running the ffmpeg method waits for the input thread's next image, where the two threads join again and proceed. – Eugen Jun 25 '20 at 09:49
  • For best performance use a queue data structure: the input stream adds every new frame at one end of the queue and the output stream pulls its new frames from the other end. This way you only need to make the threads wait when the queue is empty, in which case the output thread blocks until a frame arrives. – Eugen Jun 25 '20 at 09:58
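The producer/consumer idea from the last two comments can be sketched with a `BlockingQueue`, which handles the waiting for you; the tiny frame payloads, queue capacity, and the empty-array end-of-stream marker below are illustrative assumptions, not part of any real WebRTC API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FrameRelay {

    // Sentinel ("poison pill") signalling the end of the frame stream.
    public static final byte[] END = new byte[0];

    // Consumer side: takes frames until END is seen, returning them in
    // arrival order (in the real pipeline each frame would instead be
    // written to the ffmpeg output stream here).
    public static List<byte[]> drain(BlockingQueue<byte[]> queue) throws InterruptedException {
        List<byte[]> out = new ArrayList<>();
        while (true) {
            byte[] frame = queue.take(); // blocks while the queue is empty
            if (frame == END) return out;
            out.add(frame);
        }
    }

    public static void main(String[] args) throws Exception {
        BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(16);

        // Producer thread: stands in for the incoming VideoFrame callback.
        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) queue.put(new byte[]{(byte) i});
                queue.put(END);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();

        List<byte[]> received = drain(queue); // consumer runs on this thread
        producer.join();
        System.out.println("frames received: " + received.size()); // prints 5
    }
}
```

With a bounded queue like this, backpressure comes for free: if the encoder falls behind, `put` blocks the producer once the queue is full instead of letting frames pile up in memory.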