
Following Android Camera Capture using FFmpeg and feed raw yuv frame to ffmpeg with timestamp,

I successfully fed raw frames from the Android phone camera into a named pipe created with mkfifo, and used ffmpeg to process them and generate a video file.

But the problem is that encoding with ffmpeg is very slow: it can only process 3~5 frames per second. The reason I have to use ffmpeg instead of MediaRecorder is that later I need ffmpeg to generate HLS segments and an m3u8 file.
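For reference, the later HLS step could be sketched like this (a hedged, self-contained example: the synthetic test input, paths, and segment length are all assumptions, not taken from the app, and the snippet exits quietly where ffmpeg is not installed):

```shell
# Sketch of the HLS step: ffmpeg segments an input into .ts files and
# writes an index.m3u8 playlist. All paths/settings here are assumptions.
command -v ffmpeg >/dev/null 2>&1 || exit 0   # skip where ffmpeg is absent
mkdir -p /tmp/hls_demo
# Use a synthetic 2-second test pattern as input so the example is
# self-contained; in the real app the input would come from the camera pipe.
ffmpeg -y -loglevel error \
       -f lavfi -i testsrc=duration=2:size=320x240:rate=15 \
       -c:v mpeg2video -q:v 5 \
       -hls_time 1 -hls_list_size 0 \
       /tmp/hls_demo/index.m3u8
ls /tmp/hls_demo    # index.m3u8 plus numbered .ts segments
```

`-hls_time` sets the target segment duration in seconds, and `-hls_list_size 0` keeps every segment in the playlist instead of rolling a live window.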

So I have to switch to the native encoder, MediaRecorder, and try to set its output file to a named pipe, following How to use unix pipes in Android.

My code looks like this:

// Path of the named pipe created earlier with mkfifo
private String pipe = "/data/data/com.example/v_pipe1";
...
mMediaRecorder.setOutputFile(pipe);
...
mMediaRecorder.start();

I also have an ffmpeg thread that uses this pipe as its input.

But when I call mMediaRecorder.start(), it throws java.lang.IllegalStateException. I've tried starting the ffmpeg thread both before and after calling mMediaRecorder.start(), but I get the same error.
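For background on the pipe semantics involved, here is a minimal shell sketch (the paths are placeholders, not the ones from the app): opening a FIFO's write end blocks until some reader has the pipe open, which is one reason the ordering of the writer and the ffmpeg reader matters.

```shell
# Minimal sketch of named-pipe (FIFO) semantics; paths are placeholders.
dir=/tmp/fifo_demo
mkdir -p "$dir"
mkfifo "$dir/v_pipe1" 2>/dev/null || true   # tolerate an existing FIFO
# Reader (stands in for the ffmpeg thread), started first, in the background;
# the writer's open() would block until this reader attaches.
cat "$dir/v_pipe1" > "$dir/out.bin" &
# Writer (stands in for MediaRecorder's output):
printf 'frame-data' > "$dir/v_pipe1"
wait
cat "$dir/out.bin"    # prints: frame-data
```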

I'm out of ideas at this point. Could someone tell me how to solve this? Any suggestions are welcome and appreciated. Thanks.

  • I've also tried setting the OutputFile of MediaRecorder and the input of ffmpeg to a ParcelFileDescriptor socket or DatagramSocket; that didn't work either. It would be greatly appreciated if someone could give me some examples of this, if it's possible. – Harrison Apr 14 '17 at 00:42

0 Answers