I am developing a recording service for a custom Android platform. When the application starts, it begins recording video in the background. Unfortunately, the hardware it runs on does not support normal video recording. My workaround is to capture preview images and hold them in a circular buffer; when an event happens, I stop feeding images into the buffer and assemble the buffered images into a video.
The problem is that when I encode the images to video, I just get a noisy green screen.
I based my code on this example: Using MediaCodec to save series of images as Video
Note: I cannot use MediaMuxer either, since I am targeting API level < 18.
I will guide you through the steps I take. When the service is created I open the camera, set the preview on a SurfaceTexture, and add an image to my buffer every time the PreviewCallback fires.
private static final String TAG = "RecordingService"; // tag name assumed; used by the Log calls below
private Camera mCamera;
private String mTimeStamp;
private SurfaceTexture mSurfaceTexture;
private CircularBuffer<ByteArrayOutputStream> mCircularBuffer;
private static final int MAX_BUFFER_SIZE = 200;
private int mWidth = 720;
private int mHeight = 480;
@Override
public void onCreate() {
    try {
        mCircularBuffer = new CircularBuffer<ByteArrayOutputStream>(MAX_BUFFER_SIZE);
        mTimeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
        // Dummy texture name; the preview is never rendered on screen.
        mSurfaceTexture = new SurfaceTexture(10);
        mCamera = getCameraInstance();
        Parameters parameters = mCamera.getParameters();
        parameters.setJpegQuality(20);
        // The PreviewCallback delivers frames at the preview size, so that is
        // what has to match mWidth/mHeight.
        parameters.setPreviewSize(mWidth, mHeight);
        mCamera.setParameters(parameters);
        mCamera.setPreviewTexture(mSurfaceTexture);
        mCamera.startPreview();
        mCamera.setPreviewCallback(mPreviewCallback);
    } catch (IOException e) {
        Log.d(TAG, "IOException: " + e.getMessage());
    } catch (Exception e) {
        Log.d(TAG, "Exception: " + e.getMessage());
    }
}
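The preview frames are assumed to be NV21 below. That is the documented default preview format, but it could also be pinned explicitly in onCreate (one extra line, not currently in my code):

// Assumption made explicit: request NV21 so the YuvImage call below is
// guaranteed to match the preview format instead of relying on the default.
parameters.setPreviewFormat(ImageFormat.NV21);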
private PreviewCallback mPreviewCallback = new PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // Preview frames arrive as NV21; compress each one to JPEG and buffer it.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, mWidth, mHeight, null);
        Rect rectangle = new Rect(0, 0, mWidth, mHeight);
        yuvImage.compressToJpeg(rectangle, 20, out);
        mCircularBuffer.add(out);
    }
};
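CircularBuffer is my own helper class. A minimal sketch of the semantics the rest of the code relies on (not my exact implementation): add() overwrites the oldest entry once the buffer is full, and getData(i) reads entries oldest-first.

// Sketch of the CircularBuffer semantics assumed above, not the exact class:
// a fixed-capacity ring where add() overwrites the oldest entry once full
// and getData(i) returns entries oldest-first.
public class CircularBuffer<T> {
    private final Object[] items;
    private int head = 0;   // index of the oldest entry
    private int count = 0;  // number of valid entries

    public CircularBuffer(int capacity) {
        items = new Object[capacity];
    }

    public synchronized void add(T item) {
        if (count < items.length) {
            items[(head + count) % items.length] = item;
            count++;
        } else {
            items[head] = item;                  // overwrite the oldest entry
            head = (head + 1) % items.length;
        }
    }

    @SuppressWarnings("unchecked")
    public synchronized T getData(int i) {
        return (T) items[(head + i) % items.length]; // 0 = oldest
    }
}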
All of this works: if I convert the byte arrays to JPEG files at this point, they are all valid images.
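(For reference, this is roughly how I check a buffered frame; the output path is just an example:)

// Sketch of the sanity check: dump one buffered frame to disk and open it
// as a JPEG. Error handling omitted; the path is an example.
ByteArrayOutputStream frame = mCircularBuffer.getData(0);
FileOutputStream check = new FileOutputStream("/sdcard/check_frame0.jpg");
check.write(frame.toByteArray());
check.close();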
Now, when an event happens, the service is destroyed and the last 200 images need to be concatenated and converted to MP4. I do this by first writing a raw H.264 stream, based on the code from the link above, and then wrapping that file into an MP4 container with mp4parser.
@Override
public void onDestroy() {
    super.onDestroy();
    mCamera.setPreviewCallback(null); // stop feeding the buffer
    mCamera.stopPreview();
    saveFileToH264("video/avc");
    convertH264ToMP4();
    mCamera.release();
}
private void saveFileToH264(String mimeType) {
    MediaCodec codec = MediaCodec.createEncoderByType(mimeType);
    // Encode at the same dimensions the preview frames were captured at.
    int width = mWidth;
    int height = mHeight;
    Log.d(TAG, height + ", " + width);
    MediaFormat mediaFormat = MediaFormat.createVideoFormat(mimeType, width, height);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 1000000);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
    codec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    codec.start();
    ByteBuffer[] inputBuffers = codec.getInputBuffers();
    ByteBuffer[] outputBuffers = codec.getOutputBuffers();
    boolean sawInputEOS = false;
    int inputBufferIndex = -1, outputBufferIndex = -1;
    BufferInfo info = null;
    try {
        File file = new File("/sdcard/output.h264");
        FileOutputStream fstream2 = new FileOutputStream(file);
        DataOutputStream dos = new DataOutputStream(fstream2);
        long WAITTIME = 50;
        // Loop through the buffer and feed the stored frames to the encoder, oldest first.
        for (int i = 0; i < MAX_BUFFER_SIZE; i++) {
            ByteArrayOutputStream out = mCircularBuffer.getData(i);
            byte[] dat = out.toByteArray();
            // Presentation timestamp in microseconds, advancing at the 15 fps set above.
            long presentationTimeUs = i * 1000000L / 15;
            // The last slot signals end of stream.
            sawInputEOS = (i == MAX_BUFFER_SIZE - 1);
            inputBufferIndex = codec.dequeueInputBuffer(WAITTIME);
            if (inputBufferIndex >= 0) {
                inputBuffers[inputBufferIndex].clear();
                if (!sawInputEOS) {
                    inputBuffers[inputBufferIndex].put(dat);
                    codec.queueInputBuffer(inputBufferIndex, 0, dat.length, presentationTimeUs, 0);
                } else {
                    // Empty buffer flagged as end of stream.
                    codec.queueInputBuffer(inputBufferIndex, 0, 0, presentationTimeUs,
                            MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                }
                info = new BufferInfo();
                outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);
                Log.i(TAG, "outputBufferIndex=" + outputBufferIndex);
                if (outputBufferIndex >= 0) {
                    byte[] array = new byte[info.size];
                    outputBuffers[outputBufferIndex].get(array);
                    outputBuffers[outputBufferIndex].clear();
                    codec.releaseOutputBuffer(outputBufferIndex, false);
                    try {
                        dos.write(array);
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
                if (sawInputEOS)
                    break;
            }
        }
        codec.flush();
        try {
            dos.flush();
            dos.close(); // closes the underlying FileOutputStream as well
        } catch (IOException e) {
            e.printStackTrace();
        }
        codec.stop();
        codec.release();
        codec = null;
    } catch (FileNotFoundException e) {
        Log.d(TAG, "File not found: " + e.getMessage());
    } catch (Exception e) {
        Log.d(TAG, "Exception: " + e.getMessage());
    }
}
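One thing I am already unsure about: I dequeue exactly one output buffer per input and ignore the negative status codes. A more defensive drain step, following the standard MediaCodec pattern and assuming the same codec, outputBuffers, info, dos, and WAITTIME as above, would look roughly like this:

// Sketch: runs inside the existing try block (which handles IOException).
int outIndex = codec.dequeueOutputBuffer(info, WAITTIME);
while (outIndex != MediaCodec.INFO_TRY_AGAIN_LATER) {
    if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        outputBuffers = codec.getOutputBuffers(); // buffer array was replaced
    } else if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        Log.d(TAG, "new output format: " + codec.getOutputFormat());
    } else if (outIndex >= 0) {
        byte[] chunk = new byte[info.size];
        outputBuffers[outIndex].get(chunk);
        outputBuffers[outIndex].clear();
        dos.write(chunk);
        codec.releaseOutputBuffer(outIndex, false);
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            break;
        }
    }
    outIndex = codec.dequeueOutputBuffer(info, WAITTIME);
}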
private void convertH264ToMP4() {
    try {
        DataSource videoFile = new FileDataSourceImpl("/sdcard/output.h264");
        // 5 fps; play with timescale and frametick for non-integer rates,
        // e.g. 23.976 fps is 24000/1001.
        H264TrackImpl h264Track = new H264TrackImpl(videoFile, "eng", 5, 1);
        Movie movie = new Movie();
        movie.addTrack(h264Track);
        Container out = new DefaultMp4Builder().build(movie);
        FileOutputStream fos = new FileOutputStream(new File("/sdcard/output.mp4"));
        out.writeContainer(fos.getChannel());
        fos.flush();
        fos.close();
        Log.d(TAG, "Video saved to sdcard");
    } catch (Exception e) {
        Log.d(TAG, "No file was saved: " + e.getMessage());
    }
}
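One mismatch I am aware of: the encoder is configured for 15 fps, but the track above is built at 5 fps. If the MP4 plays at the wrong speed, the track rate presumably has to match the encode rate:

// Assumption: timescale 15 / frametick 1 to match the 15 fps used when encoding.
H264TrackImpl h264Track = new H264TrackImpl(videoFile, "eng", 15, 1);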
I'm pretty sure the problem is in the saveFileToH264 code. I've read in a comment on the link above that this is probably a stride and/or alignment issue(?). However, I have no experience with encoding/decoding, so I'm not sure how to solve it. If anyone could help, that would be greatly appreciated!
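From what I have read, COLOR_FormatYUV420SemiPlanar expects NV12-style raw input (a Y plane followed by interleaved U/V), while the camera delivers NV21 (interleaved V/U), and the encoder would want the raw frame rather than the JPEG byte arrays I currently buffer. If that is the direction the fix has to take, I assume the per-frame conversion would look roughly like this, ignoring any stride/alignment padding (untested sketch, not part of my current code):

// Untested sketch: convert a raw NV21 preview frame to NV12 by swapping
// the interleaved chroma bytes; ignores any stride/alignment padding the
// encoder may additionally require.
private static byte[] nv21ToNv12(byte[] nv21, int width, int height) {
    byte[] nv12 = new byte[nv21.length];
    int ySize = width * height;
    System.arraycopy(nv21, 0, nv12, 0, ySize); // Y plane is identical
    for (int i = ySize; i < nv21.length; i += 2) {
        nv12[i] = nv21[i + 1];     // U takes V's position
        nv12[i + 1] = nv21[i];     // V takes U's position
    }
    return nv12;
}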
Note: I know the code is not optimal and I still need to add more checks and whatnot, but I first want to get a working video out of this.