I'm trying to take the preview frames generated by the Android camera and pass the data[] to ffmpeg's input pipe to generate an FLV video.
The command I used is:
ffmpeg -f image2pipe -i pipe: -f flv -vcodec libx264 out.flv
I've also tried forcing the input format to yuv4mpegpipe and rawvideo, but with no success.
The default format of the preview frames generated by the Android camera is NV21.
I'm invoking ffmpeg through the Process API and writing the preview frames' data[] to the process's stdin.
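For context, the process is set up roughly like this; the ffmpeg binary path below is just a placeholder, not the actual location I use:

import java.io.IOException;
import java.io.OutputStream;

private OutputStream processIn;

private void startFfmpeg() throws IOException {
    // "/data/local/ffmpeg" is a placeholder path to the ffmpeg binary.
    ProcessBuilder pb = new ProcessBuilder(
            "/data/local/ffmpeg",
            "-f", "image2pipe", "-i", "pipe:",
            "-f", "flv", "-vcodec", "libx264", "out.flv");
    pb.redirectErrorStream(true);                 // fold stderr into stdout so ffmpeg's log can be read
    Process ffmpeg = pb.start();
    processIn = ffmpeg.getOutputStream();         // this stream feeds ffmpeg's stdin
}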
The onPreviewFrame() definition is as follows:
public void onPreviewFrame(byte[] data, Camera camera) {
    try {
        // Write the raw NV21 preview frame to ffmpeg's stdin.
        processIn.write(data);
    } catch (Exception e) {
        Log.e(TAG, FUNCTION + " : " + e.getMessage());
    }
    // Return a buffer so the camera can deliver the next frame.
    camera.addCallbackBuffer(new byte[bufferSize]);
}
processIn is connected to the ffmpeg process's stdin, and bufferSize is computed based on the documentation for addCallbackBuffer().
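For reference, the buffer size follows the formula given in the addCallbackBuffer() docs, and the callback is registered with a buffer, roughly like this (names here match my code only loosely):

import android.graphics.ImageFormat;
import android.hardware.Camera;

private int bufferSize;

private void setUpPreviewCallback(Camera camera) {
    // Per the addCallbackBuffer() documentation:
    // width * height * ImageFormat.getBitsPerPixel(previewFormat) / 8  (NV21 is 12 bpp).
    Camera.Parameters params = camera.getParameters();
    Camera.Size previewSize = params.getPreviewSize();
    int bitsPerPixel = ImageFormat.getBitsPerPixel(params.getPreviewFormat());
    bufferSize = previewSize.width * previewSize.height * bitsPerPixel / 8;

    camera.addCallbackBuffer(new byte[bufferSize]);   // hand the camera its first buffer
    camera.setPreviewCallbackWithBuffer(this);        // frames then arrive in onPreviewFrame()
}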
Is there something I'm doing wrong?
Thanks.